[Binary data: tar archive containing var/home/core/zuul-output/logs/kubelet.log.gz (gzip-compressed kubelet log); contents are not recoverable as text.]
z;6BM }K a{RߔrZWme.[r~gw~q &z$ɲӞt{ BpXد ỵE#f ߱",&x\//[s3fϘ/ǘuϡ|uŒ333֎a1O|W1uګ)tעRk!^ ,EVҶ5d\Š 2~Y/NIm{Iߖ/p*lxGRIܴ-z75״gHܲ1S‹'^B}qoYB}w61OK .XaKZdw,;6h1j7yK_݌Ƥwq28 7[&Zń-M;uҚ.Nzu(Z/9y](oEMcJ /O pZ)rB^d!W<7"I6z1u } L)5KFZ]t7,k?P{m5Oּ) ç1m0NW]G>#y+K0)%4*xrS@) lwL&DK~V:$|^(}Dwz a!)H3%UdK2E3ko E]å}6_8XzZ8]J`F㊂4B wMJAf&]׫?/w Bo'ǙIztav:Eoۊ$hl{BHq귑!c&{'q2 (>)y) y7zxO/O9kIJ$3}Mu@zED!A`vV+K@ FE'X)s LC܂dZ8d6d0D3慠)Ϲ@c%W7]k-gV $CO}ᮯ/Ì.B!Ɲ׾Z&\O"vL|EwUll՗ߨ(ŵK3䳔%K6'4sY"BUj%YZ t @9nI |ٹʅ^d&3p74@cIj#kę\ i6 iI؁ ” oCeJxZh(v3HD/+Yg-_}{l<B%Zxi2Zk@*',w @007I;Ir&(z_ /rD A .'D2" 3d4 G&)QeA8/.bå ^FrpIV"a$QBr&1# .KH *¤AaRw=LJIx7GM#4ffCPo[r6yi'B*Ih\qs[=zM˥}2]ȩ02m^p#1xܱH\!&+nWZʀ"I(={c;~(Pp}NiQ1АJp굴Wǥ(-('ˀ>䅂MFV6Ho7$ab"`9wĩ8d@{Y [VnBq!bUY)J?&@zi=329y" I8ra?n L2ƛI^j0mR d #~PBڜy߉122u>{'&܄ DHȒTޚ|T\p@MB%ɖP3ϒmZΞCSs 4"h" I]NK*DdA[L[u9aK*L gs $։^dZ|F3^IAlc\Qqcԑ,eJ A YDɢ^N&KE}Mv;Vmq@srzTF},rMFS'K5@FIojs"hB'9-8]]I ҧ);S7៊wRmT=Ų~2HEcz9%]Aͧ|tFt ro@'D%1՛*t^}:pV@1mqR*x|R sZe*nYSNOqoqu7IE4gx2SAzm[Gy{h#S֪'z˃լP۴ #4j4p]:M#  '@;d~P϶K2OLo$(Fd]9wC/L~$gRx94Rk.!J05lϊ2:R{\ gK5\y>ܔB'sxx9G>'{levlJ>/}4IWW|&Q u \FAcx8 Ld+'zg˄8fBʄp2!Ct9ge<ҙ$#-GD&KkE1  p#tR +\d˜x ^)Շ*#j3m.#y>iA;¾=Cfw?K/y9C6t7 ,&= ^m!dw=,tuMYSi8(u$m7.3jښZaKwpKI˺ou#-]7Vxϵw[*+Dn$mR19SW]m/EN )x60sjҺ`%݊s];58bN%7=/`CS|iӤ3v`ՠPrYo8:cfM]OҮk)=c臋ۗShn/59J_09H D/\Oe)㪖y|2dg炟o::x4\ ҊaBS}ʴ@=o$Qt6y 9 ;~x_кdJN&/OO*|L"dN~X [/3gcOpɨ4x/<0$l U$F hpI6.Pu}.p|0[{_/ߜcQHT**| AIg!{sbK( Dq:YBo'&ho9^ }dC"B>=>=o4~Fg% ~-,O^1KTBH+ֺ)Fi˰>9]yL`D^z5Pa3 π֒KT0J*~GG74F:20s%>݂Fy{gk^R'gQjR jLHQ,(!$ $s}B]kn3%QԕɃL8@SV zѢ DE =T'X'uyq (uʙd̕tPpKdF  aH o |+XA 'u .DP(60,9n*Ƶ-OJ44i =fD|&$I$h"^nPCTj11|hf #N\Kkdn][z);%kq.[z6lmR%ctOf༃rt9Υ|_bgXϭt:44EW?n0DY %s~pJ59wL0—NBrjGxzNGcTWgZh*yDbibNnϙbN R*?ׁ 1t,ПuKֹ[ԯM۴Z?Ul]M+u3 7h'.|5IxOM[w&NZ1ysrE.1C4>#~S{q9YYn+$n5w| dH #]SvEV"bP'\يoBcv Lͣ6\7꺹Jf>(FB]ߌbg:YiSL6[םN:tOw|/>xo?ϔ?E\q 1O&O"3O~}hEK 5CcذFt.ۂ/05M3kZ8~xM0ߚj tL: b~5ۼ.jR[WV!}1Y P.'!x|ňSPN~+X 6Ho QX$51fZ0= 4@(EZn9Rx╔|@KOzW(r27/ЮWB\KreEHZWDT.j[9 
"i{I@s=gzG(M22y%WƵ^&\PCJcTP[~ްJLv?#WTʆK0͕gfXL篹?îUhd!y8с#jWW_u^;I+O9G-Q%%i|@c 3=j"Zcp/8BY>Eh\*^eDaH'M4L*92`)kLHjлAL&$R2 ,Z#ՁHOžc1rvpk@-{;%n'Z؛eCQ۾/1rQsKNJRz"y“(e#n<9MV5T` K \=)ZcUBTҕeK]^gntW O bW>h#EB_}饜iu;+W$)Wi"uqf+I S/Tߡ+۴aQ ag[Obȅr,j'C3Z+KdL(H,zQKH\TjR `82MU s,PK+#g:q8"޿]տjwؕr^ZF׻& JVCRDtcP['FRjE>j8 $Z-"Sd 'Mg&[2x%/vio(mPsuEW_a%@9uN*ιYRM37(2#;+W[-ky5lVe^E̋ƶiBs-OvAmh6קonν!~l"̽`zu(k_~ct{J=s@`&sVqy֬|JRΧDu8({ {9/rs BJ*)"hHFh  B11̋1C^ KTP$HE!!Z橏PnT҉"ڗTS"/]lruBr1>yԅeg,XX/0۷N$L E5Ƌ"ATCȀ.E"'q L+˝PeV)¢!qPD(Ʊ`c)AR-lB%UZ3#gfDB 5ory;FF~zş4l7_~j5k!60ceɥ\6D02AJٮݠcŸZZ{_hJB*0xEGj D%L{>]ĺD ݨ-l~)VZ[1)AE$AKY\@3QFI{ (nϜ2fVZIۧh ۃP̪xUj +ϽJ{X0LW+~!!E/PdRb0rPU:T q"vdR{2M:RФ$JLD[]85NeZCO!/M2΅He?$gSG'"<:X$jjICؤ nU ms~N|~A^0 X]u1br)bnBm(&1YcTŕvIDTK1u6PgSYu;W{Uopq `-fw>_"F/C_ΫE>>x~_/:'۰O6[076^` zڵKٿ#ǷqZi(foRU9|;B?_n\ Zn@Qk(§tj9[ɼG2[lpYF,XP_񆪤-H*ir27LnnpEGpvŒ /L|,nqohjNj{fі>o l,x 63I15 4mKumn;1JMDOdس6 Ԙu杲҂DNZ"l٥P`@ya^Ԩq؟Jȍry8yYbs,0-1yf 2FgcF#Y4*oREq d 8>lRұʺMGJ=z?Q{!{|+p>PDUYm.J+(joor79&'MHtɸ6ehyJ!2r1eQuo&A0gqTQGJNT+́Sk3[) + N1T}*ǧ;EH UpLlTȕ\'R%SR"=;sQJ_ cy~US16!u_ɆXؚ֩gj=:RX#5 Uo*Ǟj=cٲ)5jˁ+z"ϳw *>gZayQQ([Dz]ʲ;uK-sdZ֔fGcn({P@Ȼh铏ؒں/}[OZc 9Pi)Ɗh$jɹojOe.66uRRS+AxoZ,~6әK1N6tHѝP"w1DGa&ȏQ )<9[=ɟ8yH^6 k:ts|>C} e@}}x\\7mV^z2ϐr<0|~8=>`h`ݾq !_`)cw{lCҬh=PLVziUV#p1?f׈J-WzS|+(ז]} ˻/,,`70HNp1>FSmjqR뭨;[ǹ+Ev]ͥ5f/6 cUF%xllŷ?|شzJ,pǛ[>~+饨'ǿNxO _{_S r%UGf)=Mm_xlX}Hv{rϳyqironEAWyɦ\]G-v1^\j4)] $3r~.MǏRLČ j=>mCPeͭUavo^l|4nE;ĥѷǧ$pyʛ.7o!zkK :dm0S}u8O;}@uȇߌuSD'[W3yx:MGĘS5)g!.ߍz3@?Wvό} }UNvwoNcoSm\zzzmA㮋YW.ծkg،f!w3uv-`Uﱣh,E ~1٥+zZ&i~VfSވ?/3R$8B׎bXk% ꩊSױ>ЄmsņG`kPx*8y o#%csV a `tO~+oH^I1IH{ 6sd.!BX)"UFSpMh8gmm}Nq":|iNiQ!9MP%;}lZ@K "M k@]Y5k!!:XE cSDVlG@%D˂Whbs)dR),łb@ [Lvo,*+Up AkO8:]:8;Y P`yU%J|Y`,ތl P˔p5e j ޠLV\ ǩ I&ҠeB`d 5Ia ]g8DIԆQy$]\Ss99p &7^ajnlPd.  8T :qD[H\,Z@ { L+{Pl#fP57]IkiLc3FM[N@$vNaؓ}YH&IYAA. D5*k#!a1%j! 
D3-,#2ma$+@7TBD\&EVS4fDw"< e_H) vJ ith[]b4iq܁5e㐅 nNa8FaI!rc.rn+ 4GŬSFp|#c@TAN"&d_f&WnMñv*λ=& nSjF;* u`i086|A骄֓xZm#̠u1 )O v1!N *,Rh+* <@(I&bZ#D^WeHT`FR͋ޣH^ J'Ɋ סx  cT\MXduqV& :`(󚜎wl\ldi%4X.Oc>^̚ @n@ebyr W%uL 0b<׹#EiHvo᜝5Jp9 %38 q ! ݂2l 5;%, Җ2YxtZ0Z{t&3ee#[ 0q3j>N h`: 'mYiXJv#c9XB$ct2Z,UN [B:fNFLt3g ̪ m;#Rk-fW2i7kl!@#?Oގ(A 0MYB6b|w<_M_Czv_7.˴g+02J@0uPquK@3.Wfmp0v[Egf1 gZ*)[׫ 9%AMYA#ϧg3J3rM NPs͑UrAmvH)(}E:8B l R J3@OlrEXF7o04 xSq!KMܺ%@4L&rN?k7 +oV2\8BH*r !=%F5\:OGg50?y!6?{ȭb.~<,݃g&y9`XH[l.Ւ,S#ɮI-fEW4ӒCL 1׊WjeRBUuU`SIACY*k&IS'ID H]5W 'p;M[Vq |9(cfƲ2& D,Y(shU:8rB/@hV5bap˂G-k5Aұ.DS7 <!JciؓA'7 aY#I2BAle4F}E_hK-Ul0:z͖m pD$|w B,^{P@`_CIJ*QVW8D4JGVo10jz?x׏dz!P4xͬlBZbyd> _~t gMk k祛Clԝcۥkhf,m/?jR Ͽ4%@qCAuHi'dQHFHCBI $BI $BI $BI $BI $BI $BI $BI $BI $BI $BI $B@:E%%t&Tw΅2s$Ԓ[$^# d }@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H\,K"H [/m= IWHiZDBI $BI $BI $BI $BI $BI $BI $BI $BI $BI $BI $BՒ@k/ C4x1$P.SBJz$P]$+" I $BI $BI $BI $BI $BI $BI $BI $BI $BI $BI $BI $^ xk=[5մo6?R0͞G%Q\5b=.=|Fr fʫtʿ?_wW̃`z⛫q JҘ@`zeAd`_K?$jFK0!K(c<]6>]£7+MnzO0ܰƴo_>/ހy_淹}hȟ~x9Mzx_2F55J{c@sRtcq7 Ft۟~/tc ڼĥTLm1DF$A%2(+_?T~֙6NPGP Ho)6\\۠^a;lefgųGCC0&w;)=٦_ʯm.krMec<'bLVY2ba),[eVL÷ͲӁʬ=퉫S}y/:X/+>F4Y XtޝAvWȢGկ_@>oym?w^ZͲ4|{p릸Onr uKK<>|~ex*}lBX4\li.Pp͔}& FO/wdf=;'{@RNT'elָd8/Bh ᭷֡V1nlWhBtOӔ~&Π3c9m ׷EOu*9L!Ni&<.xxXv,oB$(!ѬT&Ɖr0Q̆(ا ON @qǾOu6Ej)[X BG2B΋*),_ B>M详寅~䏾[V%6r7rr6vž:zu! f o],g8g1mnE~ap Iڍ>K6Go>ꮹTYeh7h7gi7G[F|ݔ 䡁wzjAG ִt+=5Mgq{e27X&& AdcW'6,4l!G}ӥj ;e(U' e X 4qŦ8|H3rLdֿUyxq&Kws{ru*a o\nX_Rn8Y{wphIv2ԙ1U*T8li{I( oWbٕ2w3m(}C'Rim.NP}Ɇ9p}YQ8bTWg&N*̍Fnq"k^d:cco!4Lq}1WMgzHڟh=8 c/(QT s1JuDmƜ{6 Dm jGKFA.Sk>4Pi| PIEm& 7YXsS10!4bBɬ%yO+=ؑ:K>M\@8FISm}qr)H[J_o7i-%/)>] Eƫ7it5 N=6~{{uuȭ>^>0mA=mכ\>ۏpavdh39Nm.Zl9o;2 <l10WeJUf%}^ L=8bY?o^=|?mJ } -"&1ڒ;GwA>ro5tk_v}x;+YYzȁxs*N jYJDjOJ.ONԥĘ%UWVk&^qeZuvhz_fݛ iRU7)Ro*k~r髬xJ:Yo>9k璻=ܹ-].v~/|wgf+yM9ƔCk-KBԈh>'q hǽu(s{bA)דB9^Lz]>*(epR=cUI|aq/ /.kTmvW{/~p7? 
ؖ2+'*4!o/]H68-G"Q)&D5< hqV&W:(QF!IXU΁Yej̭UۏGq-<]w)뵻k=?Ri#!E4HP,1GQ1HI;WVn J8sǡV="zJ`ΩJS`A4'Lh4 j,]T=ZZ[E[b?5к3Tw`|?~"HvdQ"o${ c(޷ϥI&fH㹍2ȲRQvV8VP?yv_3:iMV:iA2e&tA;b Dvxi$݃TKg2Nmt-f^rrm4DJrL"i%Y4ҥh剛jcqx7-1nA=Ui`RA'q`wz֕w(x߲U*+zIN"`q3 o$D$rt9,-ɹ5LBt*WVKäԥH"ޚ 3xQIA#-ђ(*J9'KQRR$YĄ/ R">EbٹlYji@U˃(htN62A0IJbي5 ̨\{O\Y'AX$l]}jL n MZ&:A`aHJD3sIA57%NbcM093})<})br;u`:3`vJQRFRY5;^R4p0-TdH/DB}WT2'Qx&Qm$,[]ɠzQ wύV.Ag֤eNj Qp&m1ML<UTIn5۳YߓzF•\9)#3Vfkʺ[r WS™)GU2Υ ܺI묭͂,220S3x V\jٽ ͊y"ziwGN&[DN5[9yDs" 7NHeXCPHqFhC@PpC Cgτ0gDBJ$ %MY(K-7&9kau4QƱR,)Ɯ8yR #O/ږYa7(z1 ]t=wUɭUÇkn]Bm`o'񶟟8x(~wzۚQɫwޝSGkwnxKyq}~4w̳+|GѥڙG_{-;\]s[I羧ᬕQ|rreC6rDF :E C SDK9Ej+o AT{)pJs*O%Zu8|ѥTKa|dyOF>*=#nuM;^n. O h3,qu'"DC'%bd>s-=Iv: 3-6j&yqvv-r/ٻ6dWYlFꮮNzYmuY "P9̱@KO֑08-F8:ql Ng:(ޝ+(ѝy,yLCkcpql\'˳Z=d~bD͏'$q/gnY^EgٿL:HIVKnQ *&/Eum}][_׶umhDT@ VEDV[pTn/{]*N$Ch @*Ep.VH,)SXHdT. LNg'P.ܻ'fBܼf>l5wZS?LTU_S|Lj*2 Ƹ 4&RʔuBYT"̺*LHRI'94D^eѻJe(.+Kkt4ʳQ]sg}v\KʲnW_UuWU1Ŵjy\G/YZYPu҆bT^ &X]y}/҆{(j^׮d%xBQQ#^ UKȋ E"`DJQ^Ē\ϑGB{*MH5:]XџT^FR"8-GZ۝^~QFkʺ{W9/8?ʵNgHzՆ9m^C"lxep.&'XM 9AH*1gNC,:ION':UYڙN|#*k7]n_rD==7B;Slޟ:#s M7b 4ki<1"e1Y1E4JB^ȘSql=v۽-rN(0$G^Eb JF"L"e/::~?.;8? Dyy9I-ɴk)O@r>u 0{';;#Aٽ8FœWxV Ԁpj@ϩG/5 &btRNb{zep(EpU65pUŵv[JhJIwJiB-$ZqƻbiWUJ=\BBz+M ~T7ͤ|yn}~}-<ݛk:DӨdMmK8ѵ[(6ʺ PA1 ߻] ? 
F!).hMebv.XLvi'ɉ&5ЁB>Yq̕[m>~3tPfq¸WMP#X+N}mL_1hUP%KH)')6@vH79g0$("wZ#-F"L IJD[[Q0ڃ/"HƜbB#t\ә80&oRK5jbFn<OjW,6Oy p:y=wI>~}FRE0Wd*6J̅>hkU-:y |oxglwv0ikƖ_>/=jQX}AER{շBWR}Mmfv{ZCWq`GmfXZ%6xJ)&~Ln4l4D4)٥Mr.8}qz|O:R'Ql TX s@kPs\AM%Ʈwގ:>wד->9-sMZFɬӺGLDLRV]4[\q]lRgmNUU5zpP_8l[: >6}~9oƛK3vFAB q17Ћҵon_JƠr..J)Ȁv5Kkm#9oA0b߻@8lKv0*H-I٫?Ë(CRHㅥ5g]SL We{_oi>_jNb}ߞ2[ȯi+9~܇E[(0[^n:)7/V>rS |/ ӐBW ÜL {Zì*A6PDWSאC2{tyϳ%A'^(^Y(n7zxw SKk0;Ŕ"Q&m< ?Ȗm:ꦇCЅj> _Ңr]r{MKnW͡;^f#e"A\ޗlkZ+س0s‹&ae>P2'[wny{5\F_W"fn|aVh묖Zul{q»D̗lbfDksѼ.T2օ6I;,Շ m~ݠyrI {߁\vda^DZ)C ApѤg/D 'naS^ɛOΔm7zvaSuUy %aujP{['E c=~nXqRTU1WK]yy)ysXv9.<Ѷ;O㩆'LFrS"e"Rl{G=sF Ph<Ǵo-*SiL }DBNN++ڮ9Gtf ڙ>1zSzLy*efS&ZTN( Gb$1 ՙ+zikoGhDЋLSX+·>=B< r!d 9BV0@Rd} c:~5yt@&ͽ薪e7ǥ(5(a'뀄>D&Z3[6U6|RHc2ٻJ*/1Vd6^G‰BR!)s!)C!)IٍBl4Osg9(q*`,14.pk"p Zũe`V1CҢ'("g'Ry3PFH>|S[#0N 9$Qٟ5{W1[F)ZH6gwdL%rtF˜lBt׮0 ܥDȢTޚ0('LLB%-fbۭ&[#!H$|璒}KG ! At&~ SS~HRl[JTkڄP[Lgk%fBbN[C&m"@ؒ>b>Elv.H$:r<-*,W5cTIQdL Yōd4g1*f+l.e$ Q^:$cJgN!kq+}Mh OhK#w>%I.RRVTPbv5-l b^{_eD$Z N(}۞-i:q*=5ǒ$=Oiqp?)ry ?IS>2L;OV ae *d06&҅SŴ_`.K08[0Ì.cJ{Hu*He@%_ƷKEJ/B-]%_{#X<@zt1jJhTJcu\@xʕ +=GJxw߇ Xolނ[QV# uN[crrȓUL$9s@-̱w 6r;p;5ۤWhW}28evJgڦg*z1BzŠt8ZnWnLfOE .ZT@S Y)|ӛ=m-?)4E? 5=PϾH dG& ZERQ`zSbuF(a|n"@^6gR MqRk..Ht)"fMV((8@ k,s8vLk6>_K|wg[Mphω7 O{W~ u*q*#VpRWv z[O&j/J܂9! 9gerpUh)%0jփE6%Y1%.\ȕJ\d#eeP:ޔbi-#92zv#}>6T\|us-[4oΝG xmڲ\`n=fA+3{`>zբ1.lwȽ-ێ+WC_`~Fdߐ oP]-nqWA#k=}0m\u~~M{[_=~pGt ~=W^~ѥd[ o= k}뱁Miؾ&pVڡx۞[Rė)IBUdq]dϖ (+ᘭQV tk-8~ٸ] Fz|((d(@Ǣgق<ҐD 9B_\>spPzz>lx-wFtd|B% k2B19b.5"dTb W(4RJ{~ BY5$VaU r**J/$ɑ $#uh!s{<.kQyeuGO^ IgŤu[) Zɤch*aLi؀BFo{&B'ab DMIYNVȞet쬐`'˓31 /{zAjWR3%K*1fôade  $DҐLnX$x]Yh}|=rʻkn_awgTW.RrH/`7ߚ7UH2;vV|(K2͇ep&4Xh&A3nP2=>d+Yv%Jq WlM1刳+vvn ~q|.l xyXa)sK60 i'ů%7кzxrP, OV[O7nWV#k_j=pqy5Fkex~&\~ s Efkd>-*O&4D#ުO>nhr䆘KhǣJ]٭Md:zQ|$,8Mk yLJi8\fU~!i2`Ybŧzs.G='nUi֦g5vy9Hk )kbMg }=Oi1g|0]rwZ=/MatuA'RFן~,?O\/?@OJN $Y( L??mMM&ሩѴԫ?4/A-M#7P|ߊO̎$׿0͗]ꓟrtk3#U$+D623y45UF۷.cB@|*1iFT&,'qO,?zN5QB;wN%1~4KOAvC::J)+U(]TO=zVy k22dE%#-G&$L&G)fmUo)*Nz'&0Nzڡ$VZ>_NJrNv~C~ZFa^}t2(qщБFܣn7څ ̔[tU&PJIDBF! 
KSe>3^ ҡg\(̂VVې3)-mW9 ,TsQ(e4FIJX"GB2ZJLDHd [Fs̻P]ͦ߼ldW~l Bw]G]/.lei&C1mBj0$K Ƚ1 ΁lD@y6Fo-rMz&!qABerrZ!49tmwmg9.ٹtsW//xOLn+zD-OhQk 'n8r:mQkOhqTV FoWApoH]˜7 ߊ"j%B0`ͫvTbZ#TX[ ~6Jg6e!M=M}]MV)k> >ԂΊm_Wki<-dnZ6P-Wi=V SJ- dnl^W[m:Zb:{Ƨ#|eϊOqqm73؅quɐk>)l \_2Y֌=7(ykÕS6m5JLNƋ>)-|R5|& 6m2PMu:j ?cnK8E{ djсSD^s5zz B.Si2yhS*m: >]UlB^fu G3])&Ne 9 ڻ~yѸxϼzHԄ?4ѐ RiƼ!ͱtH3:O95&cN  p줎V#vL56M44,LhbjgtfZƚ(mԼ?U) BГ r%! O00I$U\TOAx?Ur^%=.k;ILc^ :1_d|lvaq *.:VhBP-5' QӉ\"NEM'j_M'*nT U 7,Q,{62=o_3=`2̚PtT5 p100]sءCH8H@ŇG%" *S!΋l3oU Uy+vNeA&LM vn=XVb%e&qQro#ܻrV}V ԮaWv_l[9vWKZh`EOe's9Q[#h[>Et,:)ukNF]%rUvӪ"RW`u: bd@UvѪ]Rn*FTa&@&1ޢYsc-ɃLGk VkV EЈlTqW``˂{i)N i0L2"Z9U=^S Rzyb7=k?NB)+t`nZ$G8~e"y:b |m)_Ӵگ:/2c8|)݃(dCmH@[$- hE"mH@[$- hײ-˻՗17`G:"" 3|n%>؋#geݭ ~|V֎ʚ~Qzt&/Uv+vƊW.sW z~V~uyII)h|v}`f#[+<-;"YQFT TaN0RIh¸ D*õT[c^5 \RN I8F.elJ)µm߂%ȳscliOhpTY] ĩF1BRSar-? XjkF 1>0 aArMEZFM"I#0 VL=31LQ> IHpF5hHȃZV1<;8=΋'k=kIe[G0]z}gKx![zsNʤr&2=k)dG+՛gXhʋJ3{9b9S2yRяa@rM!`q}0ugf4 ׶{{6 > ]W&?@0!,K KMvŌ\ q ^Aӹo|կA?l` 5RQ,Oߪy Ιӫ"ƗIsKuFo` =Qe(|QvV\k[˛Ue-DSbN/v;ʙz65'Q~IJo:!Ҟ_nn(+YzB?tyi\wkG  WjB3=u9DBROaqfR4)ȿ{XhT)W4*'ޕuzuW?2/o^~ǟ70Q.x//af`\&[AFzoAB?_}757̆-.u5V yEL|([nWǷA7T'?ig֪`W4e(lNf|O1YfUWu싵],F*IIRH62DŽM\ W6DL`'-c Wކ vmvpzC!F<֑"8j0rL=6=iPg7=wqs44w hK7݃{bm#h(ht dZlj_ KxdQWb?ȹž]ql Jc1F1sɭ`<I1i/, ~WtץG%]0("ƴkDΉ 삏\*0W8X8)`k U cJ#68R$#/gsQ P6#g3NWH6.0db+7нuzݝ.IJ]Fo4>_cc+%8qsl##CGw|m(D(\!([JL.0"4-8jィ 87ԑH ƀcIZ0=JIDa(4 0T8 yS*KKI S"!&"!s$# @'(}^A8U%ag]_|1͈|bCV#>J|q!je e,\bK$NkFBo${V)l_e_ bX9Kmk<}S)e0MK}2)P# LTRަ|)ahFˆ@^} 8]"a9!s>gA\carEL&uTmG-'jQǤ"KiѤՂ3K{ ƛ;SxH5i)Pʍ"6Ȁ*cZ)"VD 2c&mcl6iACw4w-ƭ9i}hCozz\O֥e>Z,b !fs/!;X#{=IhFkSlΘ󹎈 .εHl>Ql=24%qүunǤbaRog8Yz_uriIn<8$˻;.~qQ7.{vCXsw:E I$XՇ쫑ufWVnZ9J9X}"aood^@;+R4vZ#aj$ т77S0UKCe[\"`x9gYBYZx`X Dw7,ĶMDkKm6wf4 ݶ dzi xSZܜ t䶊E]sy8JݎW?TLS^jqNw׷FڼF:aeAՋ&)o$eЊ h ;~;ZLQ)MԨh"m]sv\uX4qoi*6DDE_E2X|ƳMyցrb(jԞ8Nxr4}ry)>)'io-Dqƈu3uDEXp8 J lݶ{;q`SxcTw#O_XuQY )~1 P{N$K;cFZe|૏a4QGZ(ڜ-rm|`w1Yku5Ѿҽc2[/7k͏²`L230oB6Kh oCAGٗ^ Cβa -ߝIj˵y@XIºgL.xWy~voSr6dmw,n-z$%qu}Ӆr7[_P:LbNE9VflJhC21 C`@+,!n*P?@@AMe( 
KT8c<).3*۵cWlG0/ݙwRt =0 /J|D%L @JFYFep(uNhC23CEG&e d`+82F>Tgc<̜x& }Ac_D#q@ SAfv9ȫa`A0얊Qd@euJ97tŐTJř`6HRș4 )O##bg̏:0.W;ٙ싋0​Oc m%o sd։%:# yǾx:C>mٮ؂Α[W5x,J$*rg*a]4 yhw]&6p-\jBJqs]U )jӽ=j;jls^Oh{i53J$%*@ ;Q2Q"ELt>ON dLRBK&S&FCcۙ9;_kt|?=<]iMkZJՒw.Q7]N6:oz%^^zPSRDFp< (*EF7 Y|)hr)&&Ɗ0!*mAI3ǢH `]୷glR9bHdINٱ~>S)YVZzǤ.*irf੦HΓ"&5nb|_Lg+),&7I@UHK s|$P;Zh&qc 8l-*4x:Ra*|T D%":~,J|`~k/uqp۪˂, ˁ].ˁL̅:4ˀ) "Fl.fKuM *Gh-xY[YR=x5h:Atf*:s;/m琮G2a*(C RI! Zf{̿/0*d^R}C5v43/9Y.9?tYx(ߦOͣÔu,f.'aG8JnҀ|qZfbo)*2tqN̹foђ:nILucY$0WRǾSo.>AV38yyhav7u:aQ77Z\tyǶe Vjutt\\&/O ;{we⺴{h9?x)e|8Ѳ: 5{Puku%pMqFkM!vH)Uu1>$lJ5c;Up S.Ej,*Y*irΒ*iH "q@X 76Ix&>.~Ui3$a[OJw2SʪGz/?qɳk.iZՠ >@#JGa^w[+[ޫ4QUU"dl?TFI(Hi(A)2h):gQ.2)GβP\.t+D:P! JSDAD@ȐPNAv. @bɨ/jHV6[:Ip>D}c8~d.h)`K 5V0 .p- 4eʽ d-ƭd&l)NYK^1Q2ɜsК1qZW4 ni?9_ D^i׬p9nj݉ii|RW"{;fX*Lk7Uszv8pN[.z}n7潦[v +ߜW;\ẏa6>pч׆oyIVx./ߦ"v_WZAs2Y}Jm͟'7CQ>j.бUkQ`ZP0 $:|)o'{Ji#JBLMT J7 P>kLHsm$玊ϾF6җW MEPrVђJ3sAx*NqhZTD/tVTPh,1_w`lV*mpB8fAc@oHI?k١ch6kfx_ [!u]~mߺlΗ_PJC@2&W:u* -NU6A(tPA{'O! oEnìfV}sh2Y'{MQ&EQP2KV$%eb:e"dYx/ czkVa5EN=^[euiaЁP.+m0{1RO4~Ůn BeOO]1%~;Py*Vksucr}i뿾dYmɷ@`l~1t>蛏/hڒٸm=I ӏ׾6-'Ղsx5NA7݀gs.*es&2gc/mzAzb_̦FO^Z*.X߂UЁiG˥'\k6hy5ĚvwvwTf˻ˉ h+U̿dyT<}._4.UuE4JРwq>e_RDY,.Aᅥ\I O]]:t߾HeuWk̏6J=~Kr]$Y='BzYTlI~2&"bMrSݔ6&ܛRMD/.\}?G7A83B~͓Ws[EWo (߽+ /_''ӫj??v_X˸v׸O -XJR-~gݶl5sے6M3?S=J7-U!]*I2K*_Mn%dOHSޯ:&j 3בv>xk"bET %gDIHI: ̙@fNrpJwf@8(.f]u){2f 74A@$ljgMVb63fyv1w ; o W_ߡ3wauZբτ  CUSP*WAR AXy( vjI l))'#RjAkd̜0Ky,3B7`QaE..ʢg<^qBl~#̘pJY,/?cvF6$Xm*P?@@ػ6r$t l$" vp`{_/+ya+adI-6ǯb E:)əȰ6ͦYbˋd\nF;Jm)Jm)= u/N%|D%HQV)F˾buj=hC%P"[2IYDS{ESxgC*:Fu&ߺ@lQ?X$b3U"Qqk SAf&ɳ`}P0 sf떊Q8ddmR0@FPG#&ImlI+$$1p%bl0ryI7ccu6] S.Fy'ޘV[k5TF1Z' KtFh B9,\܇\<y،:vXnp"lQXbEگ[#jzcaY9 !,g Li^m%w΋ԝSǢ0h+٦3Idm*grTݑ>F^5=Ԉl6e̬"Rj h ʻ`S"F,2RhD:!\AQ鬵PZh6%twGBS'4<`p!nUɞ <Q'4[1@AL@4ܦxS6Mm.B I5s1pp+3F AG@kEu+e IgBQ?!"-n*@2FT`. 
mHHT;XaT0qSA3tV,!%h1ߓcو5 p2%;ͻ]9=Ѵ .:ktvt'mϺɷ8G.2D@%j3Z`-VകΊ&рɑl{ N(s娵¥TRV#sB(DO&DYRQfgT@6cDU =&hJP^-05]"x޼3,?} Y {I٭wV;U|?X?ӞmWCVPlIz7z?=M7s_eFͯ];7T~|hꪸF={{g]NގG_/oƋ^/?\fpOoke.iI0S%Cr:kj6P'C͓#y" KƗZ>{(Q&Қ{Եb:mDyTPyAǗ}ޢ|ž CϵCFx;ɵSuWm]-Ob&Cddzҕ͙Ƶ]v~3-3#/ݹƗkƛ_|=-h~7ws6gbq)Ko] Hs[~a߶G{p~>|FTio &)E/cSs<: 3wq׾ZasSiBϘ3z&7=3є'ox*~W'+i5ȗҡbRfvKaFs=AD}9یWGcpFdX|=3vv3lڍcm+vFgܙ>sFv1ŭ;)0R"z)4XtfJjt| zÞ1x~˹܏FhmLFS+|;5 U&뢂P+(B F hĚ8 D/t)EcexmД`l.dN):))h D5-YOi{4q뗳W4=lZތ*#x~)XcK۵㨾^_>uFRX(#WЃh"8*sM씹tד%<oF[B . sΰm:rRr`cBґxSQ "QyDxn7TYɱ0_39曇l)Y+~Ƙ2r0xiEd{+`.٢C 3ʅ=ڱ-b[;QȮas7X?_ʷt-գϯ] D $ p-&}{%ـ'+>o]&vߡpc :b<$E]H>=;)&&myEA[ϙyQ$@]CE zXfҎD5̉mV23[rLbsNU?HȚ 䯥w rA ƠBR сdUMSl×(?"tBƸ_*,&՛ju/[#rVY*2N|pz#,w櫞0׋8hc1P6( VGZT8L_PdV |t")?X;}}Vq u@?wB;p)%>ya.1g9_NADi좵z=Tz;ףݹ9v+x7>]n.z6#wRH}?VPWޟ˲Eߊ݊PBdkCN,B&TKF)b,$6x0ժ;x:]8imxyszDϧbtU y{9߽Wmto~xqbőkrޥ̇O?{eظe~ ;w Sj[eUpmAPKlQ+jR[RR!y)޽GݴHjgu@,>CLKFa :8XZ*Pzj%a\t^ igsd1w2Cp Sf.Ej,,*i48g JpqJ{{V;_I ~֗ﷵ֊iݝǂf˶;>xcGM}ۘqE8Lc͒q5ЏhL;oQ,9uUΟ/if=Fg:e-tEE@7O1C–w M~FO~(2 0@BKsl N9p*ȢW Pvd>9uB`1:ZLN:km ? ([#6؉ReXJk_|)vs)5'ާ{6{x쭷`R66=gEV➖{mXk}mxoy~S}@Ko ,aqW-ĥ֤MUg.(952z*A ..MBal~{LNz^ϐXZX"H`VRl SdBm"s.x/bIvHⰟ9c7|⠳DR } U?D|\Q%բ'/5 daHa{dH#<칺'~tWOVݽoHtU SU^@7 qm}aCq.7Yc_D^/>ukp`vrzJJQnS+\Rf({o~CkUϷZRUc}QmEVluEʶЋ(ð\p /on,"6|Pbl-81qE+(6N(TJN>fCLđ98Cu{ GިS>!+IZi'"b E\L@ gH(H@$镤mX`e["Ž!p8P:G(jc!FÌd>4Y ad@:E:;ؘx!jby. uWN)B,|8?oz)!Jۇlz7ɆJʣYe\;HuTBYh\啐5wrz,''NNќ%JJȨ%Jyp!Im^f,^s=!'"Q7$a<[ҪfߧPWe NyoY/`O'YZg^{;Ndhk5:E_n+z4KExaFjRZ9ăj'5j<19kusw 4&q$RMY߫ߡovNw_̟sAm ڎ WhL3ZbqTim͢K:γY%Yhc-m84l6D7EK?S0{mOPٽO:扲\ RfY%@JQ="<" z`x`dGU2FuҨi4Nʨ4u) E*X% $R$30əByP)".\L[{;7?-ɯsL>.>nMXbcl_/ޭ 64jyM{Ɵ6B_8f@LfD#:p)k-P-u48"ch+,\J6gfN{9!=s}^.͝h49 GlK! 
AIDi"&9B4*xB`A4SLr`P6 7<n=pg6A hfA'O6 R i)َ~2sWP#jSR3u#V$P5 ULb1͉Bq#7AI 19pp%xT̐ Ԣ" ER qD C@*Z{a1qa/<20 "wEDZi="n#aQӈ:9q,0[#rRTh#ŢhME[n&)-X8**y4!BsQώ#jX/|wi(+.{\|L\ 8-:i2\8QiRJJО"G'p$fEq!pqW+xa[}K5rn0?Ly161Cp9':gnQi-zu{>qw2NW(p11IPJ@%fU`'oX:;^=F NVƒ@U"q0ho9 5D *eNRRƹt[$)=&R@=ƘgZiSxΝKBg!^]e_f^~=}F$1az4^'=|$U)+28I.kyk4V#ˏ33e ?.R$ɇ9EaN|"{.YRlvZ&A i#G)4( 0\h5&i&u0x =T' WVt0hW nmzpfMOEMl!ܻ;+ rm7RI."j6=*EBUbU 0'ez)Eڱ7HC![\g}+BlYJ{4{S".JBUqAR_HcuQN%SwTJbQ)FCu4=zQۮgLrQ -Nj QȩpF-8SR$*ɭ5'yV8mG `\_!/|~c]_ڟ{+i~u0VY-V=Z1`TLl26uF/qZXu'U3W_yYER>(>n{'ݲ6kK•\9G*#LN1d$YƤh喡ʯjaYf]f}^zm66K99o?{Ƒ_pCɮA5~ʄ%R!)U%JJ=2l3zWu8|ɾ3^mh.uv!ث>Ϥ Mb*p5{)r {\r \ic͎̆?>~:*^&];>0π;듒*듬$=M29Ke \.jd -NRJQGKT,\7;ӭדKO|O%?fb=CϥCww8r 䝠MFcsW~4\Lһۤ黇Ntf}wm?X`'Ƈhtxwk7wiGSZ느A{c?#d?z&2ii +#|-k&f;rkJ<jh ]!,OI$ ӄg ;t<1g(&f\|u١_D&>gfD,-0 T'cUK0a:-:A/&Ew>MGa{_.]r(%d':+ v+V6qys)jň$:g;2&p9L:ZR֭JeK\” GDȢޚ 3xIjR}%̲uMd(E}㔒|9KX*#8KJ]҈`>rRѥ--DM'RpI`(],t:1Xi*$PdoeTߡĶ 7:;-ODA;o^/(MW bԓK}1l"HPړ8B`1le̚ nT3lV. Xr8Ң־e;i[w3w>;P+%:r6d0?8?7oT&JuiT&Jj@n ĀULU^Vjߑu|%mvF0&tC%2YsY%ifNsc)*\*?&;۷ӳy\K2[}vKڦOYA8yAŒUR'ˎsFIZMpH<&Q(:Z*k1& Q:o@Ֆ ۨa)ֱ( FFgD6zXaŒ <アpZitr\]}P0%K 8 w0!GL7@ dף6 ׊H]Ajõ x-df] #BHJ`- =ۨ$iiښ{:yMfɄvkoEUjYŝHHE eIZ/(o񠼉H&GD.1L-ϞepTv 'ɓ3! 
=!uKH9L`^Y1T=,&ӆC),qH&(-fNo=Hȷ͹7l.|,~sUKԜ;)#%8px oM:U2&xVe>aw1U3Vyjt#@^?Pil@5Sx&ن<C@qM1svpIdɔXA:Q,9IxcYZQMbW,iqO ᴀ/#SПc\4O4l-߳KPR lM8 rpIbaɉ$>xFqWWV4{j/.Fظ~kmn|10fU1 &Oa>|hwVή8hŴ 2![liڙ$>ٺiiDuUTyG$p'E=Y98LXvVuu[.gxESH }:2$U\qPڀƿ1n/TNl0loWap/Č|˿?O=w~s$qB;0MtV^evk~ ){Skjjo>5 LmUm}ʶ;Xyo%4 ,nC@B~00YM2 "\Wl"{`R T -ذߗ@dGkM*,ǟ1>K<,"헂߸v2DB|qDR#bIA)A"2Mo$=a&#yq*fLN2!D7EIVt<*OG62`AL>IPgSu_IqDŽviW>yN󗮷?~FnCx2]ΘZJO^I,,E soKڸȄr3V5^YFe$,`*s B!syp s28'-9/xkb$c!-zФ8o|zRglH~!SNчo5og״0l_$d:Y}<˝%Un]}p쵣gӎ,؅v4&z\6KEE\[ q,o ]`ga ^5~&oG^uwb@~NgV|2"+Ru;p|y6PrVN$)Eb +Tdn3l謑wWY良Q^իf]hRBo4\c'ҽ(tU}lNnb[ нU(hxÌ-uT^jqlTG;^ /j~:onweG4A旟EWeb{={ɼl{P6VG堄gǿz΍3>6Z /mkg#^v`́F0H6\vj)%C.D\TS5lkA>nkA=i: ] gˡJ:d'WZ:H,aQYm@^D0Ar-1K`|$c\Dm*;ɓ3"LfvlvΆw&Q>ٹ[Ғqx>\΀fsQ}]5MI{j# ~cWFaPb%#x ^)&́JDȾ㞠ҠY,ҙd18S%nrr5pDTe%;~ U x<M󺌎ŽK_pѭƦhaw#aFq0ᕗrxr߁]2JҦ|m2LN^pW+`\}ޮ죄%L щpsS,ڃ<K [cc*(AYT-P!FSf"gj(`cHdqM{jn?~yBF+= Y{HOL}uU^PUoVsÚ'/LV P1Q'ꟿvQ}kLpоBPo?z45KN W7~i!4AZYh: I.dj]b9' O-E&yu'aXD!œN|!Mzi{n 6%\ď V :.sΘ.oU@|WM>y9g3gƾRq_3Oju0;LK۽dݿN^#s1'iA@ZfkKG^'[Boy|E,ެZk鬪. cR-O9MHU#U}T>RGHU#U}GPjgͦ:6{fw}oCŷP;CiFAݯ%Y͐8R ,#|SUJDwmmW=ļ_8y 0vE#g[v<őSe'QS ,.5O]0eF +qur((}/Ϳ_ dm"]=ҘYTdrPǏOvbvy։pa"?--f4f-2??v`Yβ-RVd!EEe}CڞsS?h,WKɶk.Q1.U$JNx,:]*[+.sbz!2Svf**cEPSI F_lBnV>Z'^_'o֊9ZTN ˌ4B検^y0%b6U_YGM+x@"x7d]eq&$!{*޲Q\Oh) YBƻ$AY% 1>/]@Ƿy齹ʅ*eJlзsPjΪ%Zmz$fCIS.S&0.m8W醅,P, / FԾCc6;R)h0qr303!!?95)|>F%YK$ `U UnV'{9kl%A*I5ahY,Bbg6݈ob jwQ[oDnRDVI",h X]ɱ[khٓ銇H9^;j'Qt 5ĪFF!đ[Z2UOHP͐zun<炈PDԝQ8"֤j֔},fEƔM QB$hSXFH}1j%IlY5Ԇ384XH$mڕv3"vy,: .ΆsZg7+9͈#.'.yTc >j#~ N^'ׂAyɁK(x \<IYUX^ﮓXST>X/F5K59l}pvsgAB&URw64f$/uY/L>Fy" ʹpAm1\n.⥈h.Fi(Ew=zttX:+ّva\0Mwɱu_=`RbW\jb}BrM?P$ ٘XAcՐDVm}+q9-a+_3ǯv۫r3J,3M=IbzY.ﷳ&O&6(,pE) M9$WYT(lyi#0p` *%gqD. 
+ A;8Wms딳AS7JN3^Gyφcjrվ~ /׻]uruzosۻY*n\> <ɮPO1AFDh!fЌGq?JFɜ[`_=gDߎĐDSDFX'MoЂ-_~*aj4ʤ'րӉ⽾˓LX29ըrusRJ&M꧃׿,NYS >/A5_"E#{Ұ_d'ě/܅\f 0)_8S"X1"D$/J1xPuTra#5<dZcmΟ^8rua.Z ӛ;Rm131#a9A'mr\kwi6y}ӕ|tYᑇ<:Y-[6G:6.{lqą^@#}]%kt r"_GɵqoW@ź`$MM78V1 Q>j] yb8;}C%Hy_^  jF=D n2Nu5'6ʃTmZRp<kwWy7Uդl@ fkUYhϬ*X Z42e+/Q7ظ~e{K]nlg8=O#91&mϗ'=hÐM˄CTBx4c\f9(vnZVP!V"c!0hfAmI/k.%ݺklp9z5 u?A:E_55"{K!ÐR۞JshEI.]l?W켷n?qDUk-VWw',~߫3\vcJt-șnkߺLsWMw<}|eJʘ'cGncl~;o 7Ss{oZbJTnSC) /T^ʏŸ9f9 'HΰJQsN~f/aU :֗`^{a'UcM@ŐYGVlA5!4PS}t* 1,vn~of-./%JG{[w9CDPO:6Qޫ%@| ^`׷Z3@O(1Šوc$Q $" B̷"} ,/rۑjuZ~rFpNQ {5yVTSLrXQAĠT3&S8Y8f8zG_x;̀R0H9(6}27_D@\B4!}j,U;Fo""c{q=;f"|m=#Mb JjuTEdR!09 oW61"yt~Dƞ;Yȡ;{G64ٜ&&My?]1dN$Z*!aL69Q XxA%XeV_N 6\k.kw^X`BtXvi2א8jN V " dٓr xt2e1 3 @[Z]m% /a5NSEU* ,4LZ,k!*1Ǩ.r;jROLFw gwX]&v|?G1՚1Ke-%ZuI)6UJQ[фPC:B.Gtb_TdSs6G,.bf\%^<%|pO6?蘍bZN(ĒM&ZՋ=TSur)B6,T^V7V+SZ(U7 ,cSߑTfRD)U4,ICw:y`0>-ltN>fMdVT]EG\010-RR5ŪM}|K)|ݽ/m-Aʼnv/Mn2[5/a}zv7}H uC5lg  %FLh虯Z|5O] \]=9{_{ez8ݐ!ٯl'<&uڸK.ZUW3efqY|=9[<:kH54 F:f^)yz>}vqL_Wv'=vupE}FXL#=8Fú\v+]Jx\/<UN\B6#9bZwR[ȳȳ6Zo4D9{7{xa0m[gy{#;^pd: JZʡ{oԹKU`A2:H(ZpQ|:<1엹KzυwH1DP@4:li `RdF "GB|d1yDk1J8EKzq n?Xd.hU1cJ\׈U?{V0/pl"Y7ӽ@70`4^m!$[<ڱl:d$PdY VvcŐԤ ? 70)*kX-ZѲ.Hcd:Yd= B6U6g 8:Gy#{ c*tɂ/(:SB $lV Rx=ωнq'B9A\I!emY<|-*A|xx)F y u+H{ 2JJ4{aEtNDaG R(H,=P>R?sʭ3 ~L,Nڰch ?ŷ&kUX2&xW/0apG}[bM4XsGdxt<]Yʮ FѯA{fҞwZu#\;G:]7Z; HvUV"h'^uN =Y9<^19PvT%u>u!MQ'> %>$ePơp2j?jo~rdxXWUף4H0,$ κh0o!V8{,zt^@ݙ&)O[߸d:mAGmB=uIK ji#1пHL]o$=i\_Fܿ¹>njfG{T)*䨒,Q;̘RKOܿyI/i7K\t)6G]. y6!; R.۸1m l=΁ /RS43f$aBƜ Zjɪu,o܏n*A f|(cߌ[=_S!~ZWMK/9``cvq} ߩǼ;4İKtx#}I!l4129LYm=<#iAm+t//mf{s?s /륆 f@2ҘH)S끬pԜəy79($r$!a.FjM6Ȋʕ쥵J!$Qg/W EYwExb g[ /+zrGteanot`BLXKi0 ڦ П? 
)SyzwdCR1Sy/aKC+Jҹ萱dL$F&A&6koX 'ƹa f\nx(2 ͘Vs˵wY~Ӫ/u4?k~IL9)9@a!%&goِ7D-b,#1ڸٗzR&BDXo$FʊB)ƙh!m PE/ I^nM; Hu|xď~.װeZ}3L`NOjQyeMBl_dB{3>ܭ=}5Y֜a]TrtQTT.ΪjT2xs(otd5X4Yu6hZ52@] !H)Od:h_Uȹ= 'CIl&/2gWEH6 Vyx{GuS|PP!ʘ "ْ2d¡̄:Yj* G qP* ߱ۖf$C]*@aP d1ҐFgcv0ZVJV&vE`(CB2#g4C UE׭%FUi:cOбuFΆrwq׏yʏ%SBD"guBě1D/]tPQFd$t&rCT^^ϯgz=c7ђ YȚZeŜO$T2 "b3@l@LLZzԊ'vz;imӝ2 fw^@6TųF:2BjFYqG'cҝt$z7٦sL0:t.z۳Mv6IH^TX"uw"´DYy=EE'%u2Ɣٔ+H'"s^ q+f*XӤg"k( xt' 0վHGgXpK{iðRGZGܤi4&!X $!wݍ4 re82Kqej-}we*g){Gѷ(~*ͬ5wAM*7ڀhPj 4qGtn=p'%N9ga*<@tPrFR1"j"{#; _=ہk6m-j~^TdNxqw27XWZ먛.jPl^UA,&:FLASu잭>0"CPPF;)(YB޻eR7EK2:T#IPRaePta$9;COYcձ#MgyS Pktě s=iz?sJ?KQI7 ݟr^Mד.`t5N00)]_:dR!I/*d9heCIkSʻnl(V"IʙY"R"[$$} Su;#gCwn~ 5~GW=ѰF~A'v{ןo"4pME>m*z}VD8fz osm(>?oJ䝤Mj79u0/Ctnpvu3m}w=.ܟl; lfܺyw-7=/,yWZTO>^yfޤWxVivYOWf}5ӟ7ԓnMssq>T;W,6ߑm$L#bRKIIԢ}OITZOIR D4)IT0D$"TFyamtjҸ>1GkDKxuF]O`ϐnE(eFD 7H$*)u6zR]Vy$཈%>Dw%|CJrGɛB2$.]L%PªYCR2`@_*ԛ1TaQ%jѲ.Hc$gkzٳ dX%j:*\(iSn2c*sɂ/(:SB $YQ,?{WF/Z$ gv3`2|ȒG8WnɒmYMRLf5|Y|R|"X+bA#bD0 Q(64,9& *5D3OJih3 1IWiB"tOpo&@RpI@bbֽnx&?լ;?u[;-zj~ 8z޴8!x_@ߚ.Șޱ20s5?wjGksr:怂|sahwÕ̋Ӽ%FW 尃srp~Qt`:C<;CSk WG'$:T(47+l?7F'ǹ,O)|\6гBmx;5%v[{G)gH!p~;Y<$y)Nqr|M1>V-k^bv~pLn#E c( 'MrsݲKDxŬ>toѴ8z+YWkkUVéZ[rWC6Ѹ'l܊yW^98y8Y+#:u{V0\:{q>Br_o}1qe7MO`\)vǿ t?1e迎?3_fkL0<لߟdO0V;U+Zjo^5 @ɧn|SHٖ#izkgvn=8˻Qq0mId?~=gG<Ȅ+(+h6f_h0U [g>捘Gֳz+! 
׭ ^)j(tA;I 1F py!W&u@8Rx `^mc%._ ŽgP%Z`4vt)zeI5ӒYBZF5ݦNMʠZO|'T}^?BLuWu-Kp@rؾ *]=_Iy2+!ʕQTu%V̐8NJ>IK[MB+m4A1'\QC{H!`->$FwթBI[4}as6[wL`*myf 5l2 'gW Cl;vt;#j.0f)Akt3*IDR 3{SXw$vUHI<ϭqEBI#pҀW!Vē&P&eL0 Lj   @mm"0s,fW'|2r=I qę>\F~a739 VN=bvڔY:T:X}!75.5#x2y“(̳n<9Mf5TM5 `HHouaYҞ-U1AVhKyɳq$Һ-ŵ_}2wz,_ȳԈe U"3 dpIiW)R%7Z/*'TΠ#-T IŮj- )2#XSJ$ тx@9 (U6I@Z"x5Ǹ aj`T3 .夏 [<7"R.8,$XE6צ~N v4mg4v4#$/WVm-Nd| 1I+M@W"Z^9l ,:ۅlH)-,q ) !Jpp|99oh®F*|n>}}u~2,jr"$[>wKЊ(>j"qn-{ۥQSr VbWෝB zkRjtČ8Jzr9ԃTsW k26*M3!T|WɮW;[{sC뭅wi)n.ps9&u{֋1--$aq#c˚>ր_҃?84!ea2PBw?|ǫZME-牴!$vX'+j,}!ϋ :Nn@QbQ88¶\ oOw 7o2tyֆPc(/E!'}q)9!sGh`ZY*J!$爖' 'C8,rU1n)ȸJ#c1s6#c9R bpp%"35{&C^M}{o Tv8?dfHBQHNg$NZMrJvŀI.,=j5O!;{60ce<"(fJ!$&Lr *]x\]Ab㑨MIYnסv`IoW4*R@ #S'k i9ʩp (& b= R^Q!քH< $1g օO~S-}p9viw""-C=Dr48hn=,chFw0 6E0G)A!7Rpj(ʃ,eq*I3UTx|ib< 4vӪ\Y<Y.>'.zHF-@Zj RS&)r\&oHPOmaXx6,~ o?"C.7'E¦1ijikux[2ħk|Z;?D+q2 -˵\89xз[G9l  ӐE[C"^kEWחm&fM#/q v}LgZ~ޯG$[ /j~0~gv1M= zGkNy^L aeof[Lb05-7WoPŃhÜԝ~ "xPU:T"*V qV;bwz;ˆQ8X B hy-P΀YtJZ>D/RRƹIRA$VBHψO93TZ塘9]}DjEޓgZ)Y'Q<;حVGGL)yZbsi0&1YcTų|>T&Y d}~ȸlRwl'GO.8ۖݧ6(OQxŅ kY,2ltRԀK,'"d>wLl^~O:<_7{ߦ!pLmf,Wgms|b4Ml<'횆bg<ݲ HtE턹Tzݱhឫep>\CnVܲI5 Ԗܽ7e\ ˄ 9otd0WT4'm-%Kt$Qur9\s$s[_T(&e"\HpL &}6#wZ >AԄ9#55DAu<O־c%Hvy_^ W<;/%&8R| B<3YGM1O8"{O'4.wF9TwpjOqp{\/tK)dI6W}Q8 'h!S8y 'q4{W`C\CɾU]lT\B#Bї /pf*[XW l>ERȑDFNtW>U]uBH ^z!]&D9ۮЉi >~$Pޞp'AD ι+YRs<P{i4ykbL*)pɃ0e{Y8Ҩ쓨&i JP\Xdp ^kfE4 gT3u 9e"hai)<1 <0"Gx,eq'}H{rYJOzrK:яz&{?䆥:zJJMZd).{g.wZtl~m9I "UX8\ⲣei=t\R&BqUqf 9"Ry&QD1l9sO˗Kg ~?C2e3>yBՂ'wD'xK5{y1mmgPd%h%|XONGϷ7'* U };=3 PUBMKI].'+H^`X:=qOijJBQk i;T(bv^cZ \YRa֨A<.j[8  wUbZZd qw7[ܩ^V>Di P1'/02 ]PCJ+) y PkH\kJ=4"X$]T_$с>ipTȬӮzEptHUU~klnЙ RA iEʶ^:wdYbyi&Vhcy~z?Q=;Bh)YM+&M$"i@:Q$idFȲ` "&У5跜gNͣށl˪O z[G_J#u/8BEpҐQ Vē&AcVȀ9=0". 
AL&R2 ,Z#ՁHtMkgJY_oD(3Y2b˗}MIZz8誺g"@<_nGeC[ uRjGқMF$g@#n<9Mk`o:kd4Jc,EVG#$YҞ-U1EB)m\8[ZA7JC}uc7y_x/YRj,K(FlG۳rl-A*kp!.eS*@{SP8P!:k"OBRq.$Cs!w'r!w~ReFiy@N)*ns.NdV \0I*'i9ϻd Tk8.P፧HYsn c䝞zq.z6;Ajdeyf[~غ݀cy[|;saN4\-mq& |nq%s~XpJ59Na}rz~QOpo\t8B9SLM֒D W)״,tByt\+n+[췶:p<_z60I:޶rN?-9?/M*U4o=܄ N~Ųr21W8}ŜFUMnUX~T^x7]MxHV07[:˫gBug^Gum'v!Zꎬm n6566é̢|D + {0b{67=/ǫ&k[eV/M* yü> alR+=^:G?+UZP:anu+Tǿ!]|xO?ϟ~xe?_|8E&+MVi6lѴuj)]5M3w1pKswC{{M' y}|>gVK1\ ӟ,~R )|M(!Dy!f8Ñ\*.3@/[_-s5>RIO\r'IIe!"Ј 9I` l2)—y烠^4D{VcФ^ڨb4,P2X'K.(O,B:&kϽ}Hy`Psqxr쳆9Gun )`$@ 6F~Q.ZIPD(F.F 7e:eXG%Z@2@sȠCSt6b\pʜ:I)})aC2S68(\kʺVmzӺ2+]k|aBfsGW9kNU{Ӻzy2*?bh#xx0*g+'KQ !s.E"'q L+˝PٔZs$~Z)C>TB4)%*I% d\Ru#cm|vwC~uЮ{%1x@nOǚޏvOG{]q2_T|*qw\PmllT2A]L;HP:x()M(V&2̬|р# 23!&.ʺN\H .Ht) K{̊RY+A=Gkm{C?hd!Sؙ4t@C#Ń To.4oJA =v#NbJM1ALrSesqҬiZEG6L!FJJ`f(ZuG?7 6myFIܧ>aYA}p-K*|: *i*i"#-͆YDrBf.^\ %ZJY,1,A=1JFY.3 g̻\ͦ޶R5W1CďW׫vs-^]^|'..MrTe>\*(myL%;Ӟia"eXENH40OcLE4B%aP R $P1*% EDd5!$mȮ#cNh=~j&>A< qt|Mb{7˄7(_'No(y HDM"4 ]D#7w{j.oq~ע'Φk $"`^vYD DkI.UOmiC5$@~lL?#)M58/j}y6*d1۶RS̾r8]osnxSЂAwq{[`^ln& L-3's$,uB[kwj6aficsIZٯ"@7Lвnrz Ǣ 6Ů-Zٱ]|1cXr 5am,Z,'-O)L2H"#RMYRoN @ϲt4CyeMewJgFν߀xa^DaA)CF@N;^j%O&Vk;(*Jp`gM*wW㍾W8 ;zP*qŸa¯&]r2,B_Sbyc!vyҾrQtU8Z4ywTv?dutv<-8?NI /ߗ ܛYz8e]En>JC5EKf(kOh&Yd? 
.GأqEAJ_DZ'jB.> vx,&nZ[bҵ;$0 Kmw?r7] 4FOWmʯV/juuF\&=bfʈF%Ljڶ6nqt;ݑZ z^p?ueCWkHpgZ.$pg5RH)oCݠ)hRfHqD#3K_$#uHΐCfz*wNZB)-2L;OF +xĨ2,јo7Yć2Yy]HZwǣûi[-H>K6Żr24`.TV2(ɑ)/.zr֥EzeR2t9ч]Τi;MuS AjEN BĈdTHWgN`(C <Ǡrf O!jZMR&zd rn<(6+ 5U#gZ̤C;s$kN@ENsB/BCkLz^$!yDY7Lgɓ_Uʢx>LC,HI&3ĄgiK;LR"1g $sYI g'*~D } DskUO' AO90Oɥqig2cI#ХOJd 2Xl)'PVj)g\ ʹg į K{eDgu},ezm i hPGHBqp /z_'ƍ%N8md DXAfΤeMԉQ$U jq_`'.~"x χ;!nWj2kUʪK!mA5H(II\q6H[fd@/z"]P/Ȕ p!-VH!'P.HCģx܁KEȫqȡa+K}77\/4mG&nFiDq?Y9$ &lKnRf5O6|_%EJthbQZK - (Rx%qK t%Q)) .{dm2Lj4DRQbmqH(PH.͙)Ǻ%R ],x bV\gRDI&Z[aa2.*SݾPjZe㐶|YcEEoWM;jlӋ}^ uufLO_5`U25Gl,#Ia'[Y1:|lP< HsV9J]F`r4 M2<L4Rȴ1 l]hRr)iȂS"¥y\Ef1\]C$KB7uॺ(ZpoQozk>G;ʉ/Jw:}D\fwtZzzu1.,ȭͧ2͕+@hwK;?&2'6CRvUhnwO9~7gwzۦ|y$Gk-l:1I\yq뎎K*/#6-w龎k>l50gSveRڡX`mI\!8rQ˙\R QߢÜ%$H] *jQu]] oQ])ZBVQW\ZkꊨD67 yRY맵ϣ[.KVnڭ55|#$#SLѿwz4l@1q*Tc`L dCFMb_tv-jPiTմRq#uUV r-juU5P|h" .uUWWDe|PWoP]Ym8Ҍg嵋Cy:ryaK|Tod͹oYC447J`f(gAPOVݧ;g J`3kX""\3g݅J5or!2{?K47݁)b?t-*iB2pFk^"d e2eѹPr)kfȘIQK#yddh .2;)|ٻ8ndW~ٳ@Zb,e Yǻ9 Éaj ,(Ɏrp)\4XjIcD9u]*ZItzڄ9v ]WMUӒ{w/~*1w/~?~N>\Y䭨`[v2WA3Z`y PE( ++as^R !2Ϛ7{Y5kƔ%'XWem>OO[kдPۋȈl^Q凿_ցLKuee;[C.^܇/,Mz$Py* Ulv/hl6Q蠰 @Ђwggf]ݣ>;^(Nrr 2gev(RMAZ b'9D/0Fpo5^m~yGhZ*.Z}ywGU)~V!C(%ɖU( FJ'%cۢ,#oZf _uoF5g5WREf1 B&4uYa@jaN%W>xe4 95 YuT%FYi:cOg>m:;YW~lu_ Ɍ axfդLx(>YGFRXHyoWEpFgR? 
M\+/*Փ}+?i:#{!vS@)`t \^Mv62@LHVϩHmyDxngX3z#}aeVÓ]"Ę2r0xUDd{K1nLKv4* %=c[BgpߢbVkYI6Y{W oDͿ0z[=eJ*p%E)MlR F&(J &`9>ӕWRpﮤ]IJ(Vy֠ Ͻ'3BU" 9G6E-Jj2S(<#(yP0&lgB i2 qjoَSŬ8=گ@trF0?hl)$ʗ u$13/B'Z93/左Upuλ5%eWH$ݷMtvL oSrk(&O rNE I2Dפ[V56NaɃQ(,EhB]YR,4h] 7%ս#rN:[ߘ@k櫖0m0h}1%m+*j06:BB:+t&@T/2*58 B[48| iOxDZ-j2/!}fXrQ7gǥ %uMY6ZԐEDLR{=fp鿿ss9%xW桦V-洔'-5{}Bh׶9_>9(tHe j(JQL" `S> 0UICPFw7/Gy-A^V<w[<O+}<p]u$HTMA6`\\v^w3Ѓ=xeC=.IHB e J1Ѯ(\=w$IdBI\"B$ b &8FY39u`iȞ/6퇑7ޜ8L6.1eP~C1##cgzM[Rpn%8@g>F8 Cyy*}t5N0\3w`cDBn$ۣHS$ 0Q )g|IVRls6Y868)ڜ}d!P,]@}ZgED* | ' f6eWX]׾&kv^낚Ζo^?ߡU4|nYBnzwzzzANrf!wS_k쀧R1^7D n_[7̶֛tj=l; lԺ{VWwCk-7t;5BI6UiWt|>Nyُ'z i.lMڻ66OiIz_7yw;tV6d[mcucu_Jͱ:[/, 5ǾcdQ|$ *tXf"YSb2Fzab K)P=zzϾav:ۨ D,DVZŚH")(AlYz]y$R"%X9;^˵+ /C$%d.rD0PXYb6:b4IR J7cҩ(Yݢ*Q['eh#gFFaF޲'<YnضhwO=.zlBhj+ 2+)e!P%V6NKR ? _*bq ҕR֖g"DQA+_or&<0@-30Rtt 4(UR*Fր։(L:b֣rw:fcBQsz\Zg>濾#~Y9lZ 8o;*;<+^VaIw!jr/B9QH vtÔYucpl TO`_<8-/Pf$_Ycjϖ;#LWQ-QXn+.y]{`5yPKԧFwS{p!_N}+Fk~ۢV|yx~rzQ `xvwi.{J\RZY| FՕImߎhz4o^ODU?\->xnODSɜ?Ɠr<~4YEW;$M/pWL|=Γ]#tN(ìfٔ'V>]Z|:9\nj|Q/fy4-*4Fo:S4뼳dxǧGLaaz];ݫzXw<O1)A5?^[fCsՋ 6.C5IÒ;ƽ>ا5fqp[R?~dzyneU_LpUq=I Od~.4⢋vV['!N1FyWa|¯+Kl R#~F522F!* ՑDn' 1 * H`$=a)J"yIB\L^ cBWfG{ P2Cr %"jxuz9٫WR>G;_GVիE^Uun >~4B)6HǬui;h- &UXCBJa:!y @1^J(QHH0FSB{1ֱl2z{I' ߣ^JlY?YJɳCO'噲ClU:z&tDv(7QGtW p@=#ъQLLaPl{5!ĐHb_I) @[EDm,RJԒWDUH+rZE-ƢY~ ѭJ$ZfTy_{Ύcsz6kH&5b.v;2[лv:c/Ώmւʟa_oPSA0!H)S*+IS+fT @Kid2" w0zWkAV\dJlzf*m_w⬯k.uDzא`Smy^nb,,iG˒J_hano B.4֗h&hr 6$a$pMHfBkɄ\7!1(3cJdja!%%Lޠ!  cu%%p.:d` X RabS7iYfŌɨ3m:;>AN畽5sF͆c)E::s-sS?%5g.D'`# f*=^'S|OK9ٟ@l. 
*vK垍wvq+,2!,\%dłcadKHv{u]3%X0A -:B]Y2>T\ƺ||>WNۺytSa|5gʜ'9;l^rcWD+'oӧ(wPJ˫vx_7ŷ?$u|?ܷ+]Z0{< QKi{L5jD Ed3~ kOJFO՞@͏WW铉r_}<{W?;oNN7oꯩp,G& R>ßv=~y_>޿jzѯ,\*veW6bW+|Gu?#n[e;P`wt0w->?C7q0CpQ +"EWBkhJ hQWݩ Aat%z] -J(ݢcUQ 5Lu~]~99 vW+ȪJ?l4Fk]M3'/zjՓr@~] գJhuJ¢#ԕ&`g ]i qtF^'? F]fk%6ڄNVyV+cL^!b*WZŘz-?#La=ҍfK ʎriAho].-q ]ؼ,}Jp06J(ruRjɮ+M%sוPu ҫƹ ܠ,zJ^tu1Hq놙'a׮r*<é2 Ѓi6VWhz48]EWOzbHW&ѕEWsוPu^?+vJp+ q%)ue > +ci$] GF|] eXǨ+l +Fҕzizu_G+K&oopgdN3m;@PlT 撝І'Bb#sHW&+ÌPQ1;LHWjJpuEWBkgA(xt\4Rv`ƙn&ڌ+J(ݒ]E+0\7HvWȪ;Ctwk4Z{kW(o Zdt`RJp0Z+ fJcbHWfѕF5@k)]WBjjJ_`3}JpEWZk PWxUHWx~} j+ܚxtn}9ڞڝFvxfl)vqo'5\v~IM)Wz#X6-YgI6yX*y MNlR㍞vRB6h(IMӨm 隆k(@5҂P\Z8K 2jZ;7sFh5xSvGq`0\fкgJ]"F ؄q+FFѕgnyFѕUzN ip-i fʸ\;FMC|@莨at%n[ʣ=?n}U(cN )1Lt\lO(!t[x~>(仳EPEvPw(ϝ"7-]ǠuEoKDEPr~[n}|z]5esw&:o~.Yu֟R9uǧG|2::pp?W14͟\c|ܧqop ZF'w9I?!jjּy0>ӫnޱZe_>.,gtv|qӲ3p E sy|=* - wȞ}U|XL馭jrO _n$mY-i=4؆[MrdgTnlw!+ W!x;iIwl>~C{~8A~zt+wY~ytOT]BU]}0˖F:U]OP:Q5fO8 <*mjSUjt78wK婟d2 ݒ~Zhn\}" T c٤SW|&˭91NI1ҁ>i}k5D*@Z2TF$FŨw2R9EgvS\}R]ԻfmVSO\u dMF&ehqjJ&Zj{D,} :ƌjZ;B4#춸scrѽ׌ B%k@BD#gZt;L#MZ)ʕ< ۽2 IWDISU( ޅKCEUYjOlQ~B!(>:D!D쟁U'Jm^]>. >BXC1#3,s.̱f x_ا\|7w"B^Jj[KjJJVĠs\VT@YG۷bاK$qvSt-!vf8mI![Gǯ'kMB.ڈ>}* .֪/ glm6m'rHjTw¢=#~RL`U} TpY\S>Em5iR20f;DrUH]pzfIQB4zGhO06.5SGaƤU/3~V ]dr3OYBeEņr {kuԑ9vgAW5 2QN[eWT$ltFXBm+]ɻ`)^m N-54ˆbwYk.kio\dѝztJYް֞UPц*h]%`}kj'*Je 8K ) eƯE=nQ!\5"J+. 
c̆vÄsG2}BVQ7E45+YSp6Й%@ߌ޴X !2fj0 2dܐcab`H5H$4DmHcgrcEB|qkCxHAw¢#n mGؐ#o)ؗH r|( k2`=$`U$ co`әR(AhŨyg#lH;/ H Brƀkcף+{NuYc(^.$%RDS+%Y8g`-/O1RH_Μdpt Tv!MH "b #I ,v C}+b|B\FФ݉ ƙɴ߇-rrMwO9qrŬsDdɈ1AQcHjDMY;ۜL*onJ.kչcMVЍ-[ztMBKBㄷd9Y8ʳDV*=R\!TUFEbX!Opz xBgj{۸_|i"۠MCookH$'-p$ˊm9VښI'Ppu<4DH21%WD2F*Sf:2@ߓX&4_r/4<&`fm Mơ"3;US`y";*ƝfY @fd)F,G aH_V8//d<2byml6>D+Xk=tdI06hM7ZH_sFL Fnne{4^GuF(ʳVoMH=ǔxN;zl}oh;%n=nH!/1fW7R'Bl|(B{#48H,}2b;$ Dt BA)75FַY$úq5X!A|⯞o}5Gw.5\!O.k9m˻vzך(OV1\* !JZc8*dGdE՘CL T%X!aAĤЏe+r hNitRFzV@jX; \,5J>Z.ryf J@LHB5FYPȟ@2x׷l-9ރMju>@NZ TrID<Z#׃etjaLH-o5#WzQ iqc: ?ks|dw{ LH=G!,PBh℞Tt+ / smw>%*4]j]X1D̎Ь,?8 IHb]O h=F aטt^{nAz !֌;0^e!Q3Z|)`6 ER͢R*1u#w^\_2]*bB[ \|d sF@ҳO^-em>=%|]o[,Ҏ~8,+?p:ͫt|&K5Hy4.})_-.\Og(t6;ON0 yA42ZgO9fjqd\vfkbE5Ly{Աhk0޶\l%Kn]a쒿//p-u{/7n tN$Br@"'9 DN r@"'9 DN r@"'9 DN r@"'9 DN r@"'9 DN r@"'9 9.9@E\w 3/-h޿ǻT tI'#'9 DN r@"'9 DN r@"'9 DN r@"'9 DN r@"'9 DN r@"'9 DN rp@J" ]@eT8~C@@)krH̑@"'9 DN r@"'9 DN r@"'9 DN r@"'9 DN r@"'9 DN r@"'9 tN 4'>j;'j@ '!:+KN r@"'9 DN r@"'9 DN r@"'9 DN r@"'9 DN r@"'9 DN r@"'8n,|yi^RRvy}^]\XP;朦u8>Ͷ0,K%uaKmgK@kF<%!u5ZCtϲ5yZAly&:@r>zwtUU]{OW [=^3NWVj35\ CWt{ J]}ms+] ]q؃N[UA+}RI+aCtUuF]|骠DWHW1&dk ].QWNWeMtut; ;5+5 J47L\m|> KKv{oG)?1x6.kFz>JhZGx@ũOK^.Cݧa>YP޳yˢߗ1/3 ßޫϞ&Ԅ^"(nK{\woZz+O){22uϸURP)Te[Jߤ(2[ԆzuͲ]2SYTJT9U^gWvh܊?f}w>:^V]MUF9*L)L t2 |+礯T5d V-i7~kU+ zf(.tkAn-@J#}Xv19f'mǣ[|cM| Cs.QWoG Wy4G}عV.ow{Ї % ,J6>ѿzыz7]4?qֻQ[8Z EM4:(/D!e8<#6~Q-; ҡmyOG )$㸵R2ʓwY B%h/t۩VZ2޻[˘R7iAQckՔbq9w٥ve}>ڮ ]P;[iַ XuuYWZAb4ŚEc֪CtFU[w&,hUA(0kZ¸.%l M5 MPZJkQ?jUlD#%CN%uO; \BWcmBJM 8;Dt Zg]"]9Qv; ܺ3c ZɚqEt+}ϮLZh8~ne= 9dZpdv佟8MOS~NAeSy^L!2++s![6& %mS/f7}>Yz߼vufQv[}]P.jmFՎ;ik$0˦Vlʗ 9M2n Vokٳ2VMpS*ݫt?]e#yg غm=V~?IRO^ e}7@k?9<ov[ F<ȹ{ yu<`5w nMs]W Fpu7mHK a+;Clu=53RY0?qnKhw>v_. 
ٕI9/@~߁ ~ Np~?۳p\~qjc;˞οG(i)m>;O|_Z,50s9~ʏnJAf92i|Բ"H˫~Y"S- x7c޼_J{Pwmѵ'?[ow ?UOzY8(czꅮ!8 ":272dsf6Ullܤ,|aYp,ՀmF8]2Y{xt>ڹ/OJwD2+>Qofd02z_FnYx>A|}jv40Z2`4翑_ nqo'ű|XCv.1UXWgqsxlxYHC,6\Rn~7M٠YtZh8o]9ՏHwԼb%}y|+h'ZzM-J=apz;E?!ms|YfQddx ZʾK)E{/fg /=M9>=lZ ϓAsb\K*qjX"z1mM6\ ڧ|+f+^r[(-c꒴)4]aZ͚}d)J*''&uz1QkK+]Bmﱰ|([,dPѦȓ4T܄hYB%Yu.&4F47a<𬛜,󟋭-\FXf.!EJep Kə&qMZpOk4ŝ}ZQT^]+=9?D+t@z [[u U^wW,9EHfϾrJ),4&ZrŢ6gw&dʚi#xF>tfŌՁ+]E1AbHNぺ38_M^M K H'ȍ.kϲ_v[EUr#{w&kѻۘl)'g:*)z.o?, w䬽y %;eh^-ooO:O-lL`L9íu;AUɚCcʒ1!e`< k ؝2cmMR+/MԓVcH vjMN5cŸvݛ\Ljsl0i1HC*Q_"eSSuTwU&(d"Tnd&ndUaa68 uO g3X/R}1'ۻ-oK7nP`_NpX* AH3c.lA8e1cVĦFÐ= JlR!`Ve$"V̈M݈mGCb jgӎQסv`7h<< )b4Pʍ"T#XqGkZ Si_D#ɽKXA3lV\2BcE"d"!*!*c i+&z@.8BC BF;``@䜐<  PQS$S&J1(AH{m=l,YpKOC1E4@R( /N9 oH4 {.8ah`=w6}"қmVn}ov@$k,\-ly͆mkIxB rIGDxA{, /`΍ wxFg#tQ%,{8D\0NшHN4LVFpW,y O sxAKqqbxVM9XAGIpKYE@3-qˤVXX~R 6/A%gܞ] !3 it P#1Lx1(JJ1HtD ESlTHE\H1 Ib1v00~i/{g?Py..0mo1=O"KC{U*7C?HnF_ k&)LxEՂvU yGD@'? Wu u#H9R2*𧜎cnt|)zBs9^kRgf) A)?l4r4x_z+W)ˡU``漊//^Xbm *ySUrZ)*]k.akLG(&>K3qCyrZêշ[Ȩ#y?-;:u#7̿>uUշRNAԶ0{M>\|\G\<T~O˥7~S5iz< ÿJl cnǃWO'.nϰ$OA(/W%}%YۏiƩۊ߯@:qf㥏Zᛪ\ӯ .AdQ&%[ݛ-w},F1Nɖ[mg",QsoG)Rao@c<(:Vɋ-͜:7o| |ٶ L $"4rGwޡٰj=iIA|"5)i[w;6~?mzLDg$}`nܨ](hnXuٺwgi! cݤ[:t]vM'S66vԁ:exe9&-pA)gb(BܞZNe=2me, IpbOgX{?|>ޔogAc.7)Bg6 WG7}ȃRIIgadQ,ݥ[JvVM>+}*2첦Ϟ7>(+|ࠅ$a!\[gMU޾m=ay>,[ P m.{돪of^esMu,t{EW{l2,kUlM@^:m+)28xmݎfEdt<53ߞW8W\9dOL[f0& UUt+sZD)Z0la%{p-,ݪ˚UC;h"~?L=BKI"ᙒN;GR1hbD!G}' E/ o#m}&(;gƍ^ Ov"O Ecm65Eo9{˂, NKB! <8 Q9]M[=|B|xW*Ȅƚ*ab?  
G%2HBK/c "r\# ?1k5f,`y:^nDk45[!-9n6qk[esƁ לm65|G~뾥Bzog)z܇<ޔͬ\7UTS"*]&Ƈ^c,1ÅLR(CU0qUz=YARMo`vTwfXb&ЩUp/Kţ inו֫i4fU_';bϛb[)*!4opѝXiEJ'2,& x6]*,VV͖*zpc_q0n|6gO.*Z}KzNH u>`[aʖZu댉m{)t;vm{EɼmK-d4_{uw"zrVl:qS;r6SլWۦ;_\-o&g17n2`')>[u'Z X%- CB0~3(CQP0Nya)8&] z!A~ɒ5MH@H,ÔXrHB*ȣGiɤy@ خ(1ц1*~^ݚIZ?ށ 5A3 e)Fl4X˅*FvN.w:׹5몏`+VWtKV=ӆ"o| IHvOtuC}R!=8Hj,M>P1Č2'=&/a:a0'R I @,vTOU4xLiR!3g.N()ya`"ZA۠FH+KKі3ˌ3LfM3aِMyZ&/oHmz^* h6q6KŖ-tiw^5_(M;#$PeđH2/F*έFF2 wrϮr6 Ť/4i#D`,pO sd21%T8 JB"XK5JD)5:bHETBXif;&ΞvV՟_G_nQTliEsAAיiD4pAnhf  ^j`Hg9MKpR+ :|/x$BpḐ(+ujQlTMgLw~iy,0$plhA48W#$ŗ/0C RPG8A'"d?DdO<f{UF[a)b{̪M-53LI;(!y 򭈀lI(~0\/!{s>+"W"38/U[yБǠfRuPv wmq$4$-Uɺ& "@6QWi`iF;#~X=.#ɚ4ۆ/Kd}E'A&aPr2Z=So3|aNJcYj߯"Ud9-h!"o f"gdjY=7K{AvRwn)%2$cP Z6b$[^jhwY M|MCiMuYq阅'cC&Mf2-9 )27%i|pt햔N1qV]1Pl6( `qQ:h%h_T9!XOr 9l;>}vwoڪ[AC8V5H#8[<;Z~` blzz\z d^U R aUSiMK zn_]owKy_?m 0b2=X+eue!,AM"burIY`W۴z-AWAګOyaE+89?NCw7lVe=7$n+R5p6PwX[F/uu) 3A*$td Ŵ})-v`衅wBP!Kyq^?VK.bH=LaJFK^Ybi1  !Q@ Q^.uJ/ 9#PL"mf \”٦Ke_%5s@I-ܒnԸo nyMzuXwE6Yo/+tqI=V>3 A?N?x2wcO"TVyaZԘ`6OLC{rjʅ!wSsv! Z9/XIJb 2E)$p{KCbً+o:$yBr*et ˚1"\Ah $*RE;Ni Kh],W\2M03V}D:tBe=B@ jCj8Ov<ގfd=aTP>#]K$d@`v) Ax`hǫUYr}wk&ʖRF᳠lȩDV+O)4V _CzPeDGRт^Q bQ#ryEt˷Ű競N}s=5ZEZ5@#>?r`[솈r%_9h~ȏ?YDras@nOjm/لq?@JLNITds ~8jT_➟I87;cx]xo"{d*9Y:DalQ+.M.e|PUoWrj<ߡ]|j}ܜ΍t˟K-h"4zh/Cѳ1G'[&6.'iMsŨK &X/}>ͤ4i=L}[.Y'~_l4?T~ݿ^T}ϯx3߁p[@BkC/ȾC3mN9 k\#=}S0u|:2;3 /^ӏѲOQ7Z.Lfģ &-bS_F&..IwCx $u.Īub#ARvP;P/]t+?3v3GڲLwΑ63X8Mlb(T>ZF I<ف#=uav!x[amLOڲæLzЫ+JŜB*⒐"LIz7S%o{ӉoCvvS^ֽ I1./dCq8XZm![T]m2K9z@[el=-ߺbbEʢ b*H'XQf!y/d̩VIFEz3,8Cڡߢ]7 jnU~,'hף7Kh"=ħI9w&7nq@GτUN|<աֻIW^O>Lmp=#iHZbPT-sbb SbY`G.}K6?Ol;TȽr|ZD.juАE xеea{*I;4ߖ U5, 6$FTZyT[&zFs4؞N'>p"7WB&{72[ЛeK??;f=5B_{_(M X/AkZGTb<8#ҵyfMJuwt;9/QeBVlTd/Qz6*NW2R޾8I|UUf{iSXh6`bc/:U =NyIM7S Ȩe+W[jAp&st |%;A|^ )hdRM$M^DP<7 :3#dʐ_7 P\|(N9(Mf!H+r(%P x %^ 2:O^SI::ؚLC&KM 7q@;j,D)A,cssfyՙz pmAkOTLLc.)a}1VuV`p9?o/ƻ@od[,Z̦c7G `^Bbހo~Lu<B /iܮw.ΏYVe1GsӵZ]etUxQ?f~tz?-˛@ UF!~ß ƹ*΁oQҺ& AJm h1+jʋ-|z1b{6==.[%56jǶR^ B[TV-HcnSv =m]y[iC=hjgRSl-ݭnOY_RTf+5u>.5߿*T}s[/lb ;]R! 
&-W i/Zs=Hm.]!D,,Z[-2yVG-@ Pt[@$۽|NZ`aKQ(COkcQ!6J[ĽsU/ g8['X3O5^Pw艫P7~)J޼ϳ L0/hW bvUbvUk۫;RtzUwU鿩"L?t+fzY! g YCM,[Ь8:;jǝYXA{0vEh\yL-RP;j_Q.{` FII#JBRKM>gME VJ)DR3EBIҗxGA>ݛ8<=נřЧ<@ڛj\:@*yM7uwR#8NN?mޜwtG`69J^!̬0lW>cQ>vwܟ7ég,FlMآl+ O]z2pIέydO;0S 7NfLS念K|ݾW!ϷL?g{8WY @>BDFbð"}J4D?仿꙽HqvW!9F)kw_UWWٌ/띻h eLY}ZLL[udTD|9J@eVA50,\<9874\8צxVE hB08[Y>+ W8tТ-]RjvNn[&,5^%_ۙȍJκ:EuJh]ɘ~B.A&htetՌ|{ZlMc66O~j7ܺm7-gw^ ~yxd;@W7>Y=.l9tLpϙ <%a˘#(i&Ɩrc2}mx9ީqJpqs_׍խZU>`'A"Jf)iᥔ&;KIr{#y0{ "cB&2YY' Cpsݐ]xvYܤM{s^Y;-H1sߕB,p!vEG`,@j/>ˬ2$Hg\B64v>ۻGإʼn4J0Kcl4O.J) M#+{k2ZKRʕ+H'-%W'Yv=J/҂I ]wG>ٗq&[1 QcGeEM4BYDc  &&QKBw$yqo[r'\Ed1s )+IQ~k(%E1PKt6VdSx]J (%Q.AB;͘1%OŐb)V:X4n\2C85@V ]ZGP['("rܓ1,rz0R!çY83rE*dl ]BɅΙ;$\.XGKA`TV!ĔVV@ pԚH! I ;'TL&.fhR>]Wxs'IQ߸bzȜ%P4"๏ K@&3'Rlؤ8>X&iCJ %BL&t" psDi-n~Qyg*k'F,dJ{2G fYsJ:&ei-wHS(' >kC[%!4)[;t;(Kv>pP*J_\o4uM;}>;x.E"10Gc&`eᓖ/xkE 2dEtLy[^ji^V۶AAڛoצ^Z4*w44 Lht۴ȵ޴my7L?NZ\ -pxϾİ}w{^u Gε&a"ʊ[QAF{MgD6z htҁt5éÖr%=dXgHfsL#63lxK LZBI+}3fbZkZMI'c% HX$@K;uxT6MRi)~zig lQc&jA.d1dl\DPUQؿsO3$79&p 0i<{K2Si.J,O4Z[ASSL@WicV ܃`@ Pɡ4$DR[rĝna7nx4kCKnّR 8ߚH2fDeѯ=z&4Gr>0>B{G6?7{%3+-3 D/[MpftOfO҉Ig.$ObiG@t+ůXh.=.<&|ԛ}b#G+OOW3WzDڞ/5_/N?BiVzy]фx\\v`K|*Oƥ~3Dѣfx ?/͕W\f䢉%1xOo5++jeׂ#tE~I̷\ړ4y.fu&t6 oF0::&i2Sy;oB;\-zs|:pc6ϺxȶY۞UYiWJ<-$L>rYM(CvF&P??6fM?'Q#v)_~߿>fߗǯ_ ZP $RAgS叻OYWS|maj|oWú[\| [O̎$ěя^LwYͿQx$x\WQד8$槵\v95U -Xߔ Rz%]_l毵H^{~>VD*%r|*q\9Hh;/T,(%g/" cѡ|6,T^J7/BkH1!: .*C.6EINt<*f3!% r, 402g=NtzUC3x"5$ﲳ4zy:v)mǚk+$e15oF~IazH ]տ?krʐ~O^LK1~L iJIY{k!ۏI?jIQ)VT#)S@d=ˈ iZj#xe\59@4OdfWOքt6 >1Gto"/Ox8][ru\ۭhSQ57?_Δ\LxQ&sqy>5&QeeR{fnQa2.0qtEؒԳq 쟟^+wFwBk7Ć Esv̕uTRV>Wy% -sDk ɻLWOvc{uH.3UĘ^CPDrVR̨Vu<8m[{;3lwn*uNJv 0F)[;|h ۩ӫ'xiwG6XV88sY:GG.#P5Xpn|v (ΰJ k H5M%0e18wB*%'~K6l6;92.A8r#C@QAq2,:9l *+e7܋&H%fI ݊\e"Rq'yrF$f"ƹ c9;nt{*r?] 
8JK|aN]?vduG]^kmCs me,\21oWxT1ل9 pU*'hD!;eLYL)5r!&pcZs:*N<]oYO@ϧӋܖѱ7\sZ/7g8TMse&~7Ud:p!=Fv9W[R-9~ߓ ɰ.$}s!Y0 ̤)1 P)9Fh(!W,+Zw6fmd0HnN!WM9HQ1팜6#a5Ef}a8`l0vn%Zau쎬>^eS'W͑aۅR) U 2|u>JfQP Pww囨K%"rFc)L.21#xn-8M3Ҋ:gKԦC]qLـlB2""]$I=,r,QrT@ÂΙ$S^RQ j򦤯 rȇ/ag:R*TX=YX%qL)S܃G&(gZ{ȑ_iH'dm ؙl&~AgkHr&N0},;e˔%% DfSݬ*N@ZLe@$@lCY/NWݥrARJ /s (S &LY̅PPͿ2_.ޠ ޴ѿrg-5@-+2Jm3uanG sk/ң͇ѧ]幈"u&R7Y@>h=j7psp/=N?=bIiEB54$:>&NwZ奏pg1*o}TXڟ-M, T0xkVD%movp.H3I1({9Xa[#cCj2g"{Lҧp 9SH 49X砉 66NW˧{O"i`,HwL`;+0> W8M*2wf OP1HI01)s#yD!!(K,B :R9(XL;F$ j'Npȓ'NT0QN 6FCZhNg4:b0lUN$H%j@2@3,22-x#w$ `Zs*sz/]u856ыƝ6nxx/F׭$>$dt`CFhVfu19ijeݹ5=ec0 _"{C"@mˡE6XKQ yQ9BheJEFU !9G8 sYTI XRV dRTid,&zd,gb#cS,4  {\dxEOf|9]ڠyd6-34BQHNg$NXM2JvA)H'!ͪa%!;{6a L*/<(ȹ!$&r#sU0bg=byXp1 QWQZbW4J4AK-2 @1bLsﴗTfPCFːr= &D%D@)pXR.ˆNuTtUb֩Ef`D,6>6EDZiEĊ(% N8qXGJkkK44лUIIY4Alє73jT!eq-eQ{4pŬ܃󮋉s4_C\7rZgQ).BŊu>׌VZ M'|-9zt"(DF[ *.nMCnQa+J_'#׮6!IaSވAp5X{G(+_7UR޽ :&xuhxT*m40"vb u샕}pOcDQyhX;A&bIVvI B@.Q*$Q STҥvFn!W?M[8MN8xlЉO <`u㣚115{isYiߑ;S5F5,qCc ǩKUc565fsU|yue3(o\{ŸIτל)s@cIQ#]G4EXyleJW>f3zw{1dӓ4gщK-}:/mz([UzKMCIb'> S)BފICHA~#DVD " v E$ K%"x\HJ "3QKQHR_G84ha%L]p#^"fjc-lq{7]%٦)zL) QiDyfqF*jTdbSxPg%TL-.C4&7Br+JaԊn͓Syn}?w7j~cqZg;FZo: py!A$ I•Scc5@<%_dRU!)}M:{WᐍuB(*ٝb!d!{g}h'_3 ZsE yPRS x2مR"hn XF^JeiA_'9Px @=آ Bß~&d>7;̶ZOz>4y/#M?1?$ɟmNi s2\b{'6g< xBP;r^ xEÜ|ѿk9x(l5_r5[xy5uW8joʑ9_ b)G51<f;o[߿OZIOd#G ,:W &?/Cgv$+o>ڝSFqW.r=/AͮId|FϿ~e^ '/kl<;trCi~}{sz2|oΚφY{5'5gFePu}2x6ųH6 |t,19| \<A謣&i4Qy#RW+zyIzơSH 8\Oy;æt[Dw3K6 Λ?԰@&Bbc 65&Β`fbo'6Hn}T[Q*DNC^J)d}eΝ<~\\^xbr#f]L':yӔ$d8m$FJTnfFXg6g^gա8^$>|m! 
;-͠.̎\4jv:g?FG?@lC:sc>?>6y~ ho.><|ˇ < ڪls(>L\采IvҐ+;NrTSL_i9Q5qϭ1zt<5Ϗt+hMV<#Ù$c![]v !˓pf:j~ΘiݙmY^loZ5\S=}tbrm Hy=*w42y==侨77.~sys$j/__;[xO3%|q|t;ıT;7;3?؉3qMFeF$,E)ox*6DUxF=:s<7!Ahe݂.b=zD3HHDõqF _k5uu>W7P.3?7eOw&6S+6 h $5OB$1 ګL鈥&@3 9Q+bMh@ 3A(Em>8Μ6;v(Ca24CBifbq~{7Ǖl,ɉopri_|\?;jO\_^;K MnKahD B)sf 4p+-J /s (SJ x,DmJ e̿ܠ ޴9mj; j.6@6 j><{=\}ghwoY{{caNn7%rǛJWn*OZTTTuSiWZO,tyhawW`7!%"u&R7gD?O`hOV$TSMCRĐ|LF[oa)";㌩(olW5iWdQ/6y4r!f{-'a}[cvch^٭kNue֕ݺ[Wv#WIl+ue*ueP٭+ue֕ݺ"Oe֕ݺ[Wvn]٭+ue֕ݺ[Wvn]٭+ue֕ݺ[Wvn]٭+ue֕ݺ[n]٭+ueW٭+ue5{M*+uĮezH_S<"32 Xј}F#H[cT`FQ(H`Y*&"*a0zn=L[ӭtapacUnoƼ cczCwVU-h}:v·\ %'7s>\Q FZڮiS@$c9) HKl:'HMIF/"75]h_>yKי`ϑFjǷ4>uD|L}M XGx6;~7|o !W}t¥Fi:母x۹aho`퀴i)*X ʇQo0,Фٝ,QtB#dm^3o{ԧ2p:ӛ}7rYc-Rɉ:=9M1gg#`2J2m+(YS'9VX&X.YFj=ܲ9{F}T٧08u97Ƥ5#$bM쭖 Jd%"K 0$?DM@W(-K@e *DYHPiHM¥Ƃڌ݂JojV7 j}G^~:V|S6U||w0OeŭKWӭf+ޒ2XDt US}Xc$P{I$U @,;U/{fևS*ɫT4b38T#QqЈwe&JbB^]*YFN @bJC(d#1}c2/T=3# KV&RJH*&hڷď:^Rՠz1{a fs8; 4YkLƨMI )BI(eЋOE6CnaT؎Z` |pFIpC㑲-mg?o;c6&uoϹAcQErByiC!T'7ܝNAOu$bP ,B',m.h];a@TG3dL|OnJk(DvS/TA&)d9BB֍;nhg0" f; ۞F||B Ou2[/l<D-yM׺.Přq)M>꧒,/ؘzɫaU᭸C{,yp_OXXYt?rxޏZ|`jWmE4*eVh~K?eYo3o#/şu|Zu_?>w./ _^lz[*slzݰgOdg_./X=ٳϟ_Mdz}_p?-F_M:g<1hzwxir5G4|`<=[?}f|G:c,0΃|z9* |f@ZHuE<s* ҄aK4F6 CmೡL口'Wa)<#y R`?VpjYP-1uDM~g4Tb$]432-HnTϪ׌{ZY@%mRu2ݶ%3db.E(:;UC]ڴfxt6gn☿je |w m=,p}K#/.f]дؼm;gHv9A!Ng\w\pPȣ^'QxQdzF5Cp|THow\!]~+ZyO%!IŗZPl;o|9\)Mhhe{8tqh0G NwN0TGٳ`ΐF{i,'F{-JDU(kUV.51,BeCAOZ&o&97C?1ڕRz鵳14QPQ43:V*Rۋ`Goq*rUu K 3tf4 |FV|WU)sKd sANHy濘Bɇtދcm8L藳,']yj6c0+a|'G8o_=YR+Nx拴ڗZO1 ?3}-=$ȸXQ`zP3b>jy )_Tܳ8 ov[p2orA]9cNW?nMV2n\]15w6+uenKU9KzlnWw^s<۱E{7ʎ\5ٻ/޽2gy7:&;^[$>9v\һߘu$k3͇(}f)&>ۿdY^w7QS[FkRoH+VPF.uFO")|?v, Z/X!J,Ijt"AEiWB1% >d5I4`ӏLNz.~ohyanSy{+ HLuYg,l\6"GtI7b8E۔3 e)/Q+Â5AԾ5ޟV% uJE3rv uf=aұ)=>*RݪI7*qgzg=w=b,F …-!'A>i(sQ'}8 _kB'Ԅ ylm9D+s"Rb`|R:**9/k\JE5-K Ķ=FL!HJنZI[u5H.z!<>"ZKO4#gɒCOlK#7bl_. 
mV `y{'GkZwKVw⩧[Q\/(TL2e_VkLd&:X4 Ā  $wx~<9/XJAZ4ILƪ?QJ#"KJ/߷XQie0HueD (!z_T1+UD𺱜5#gO93Q'obT>א]ccCg,htY*Q c l5vQAAZKE>2ۄE34)|Jf/He )vla$TSj5Zi>:!_F]?h_He6$r .M }5A΄[!Ja5,.d\iG1ijiɾ&^F{͍>G[gdGp@E`6=z0{mkh}Eih)ikaڮg"|XToBx$$SĮ\* ^uc.2rhcb c ƺSڌCX:ܷ-h{_kN~u֤HTUBlrd6R4O6L/]o9(%Ek3CV.j;6 htw))RR !%v5hNք<<g D4C{qV JMqj? X[1F ,;02vFهdcڌ8u }9]O >!2ZJc)2Y J ֽSr iC[Y6&m22,'٢ͤG1cM5I}VY59%I߸PE%FϠ 8\*f2?)2Z+wI M jmuVqűll" +l$WSK+b6Yb_v׺ENʃgniktJ2uE'###E} T|B9KJZ$^ANHƦj%׀!kp@6Ucwp5;N7l>t?! ~XXvK..fb 0=4E©E6 E69(&DˈڛcZX`cu V+mhė|%gd$؎-2*mr1cp6@feKH3rv˥^>N ƦEb=6367\x& 1B& K+0Ic f«¬겛u3]س ZQֿ< ЇJy^?_Xz|f?]ފ >7z?~w?}#תGuԮFޏJM%{,=eVh~YdIJT嬷7YWId͐}_ى_Ba^XY9HL#cQ,sf8twU}UJʉ EUzwS ܻBG:-|.+yY-?9*({7/]^*X/Uw~-7ꆪfezG|13GԤ㧓 0d5txyZ_GO㬶 "dž8"" 3|n%Ԁ.pKcNGtښMgy(Fui/ O(!ށg4a\"ZS1E[/dô>l;w74˕7t:`)rsְr>ZGl#I LAhv]uקu;"4:&I J3i1` ĩF1BRSr߿5ꅯQ& Ĉ$ 5Ej5h$J =O48r0f AyhJ@\ DN!T! gTC"@h1hup}|̌gf,ǒqS?,UZhpYWl%?(\5xTZrMeYt $ԥW/f!,Fwp6.g%qR;^*@'~/_3ן\:dtͻߓgۓ3LN޽yp)t6@H ᷝ09h j6l!Z&K]UJ yܛ|`na@f__\[o/&hӼ,C9W62W@Z }A! ْG6{*C8êe;u|+[土cgHc1F1sɭ`^4蹻iKf~3j誯]Y/g޼gʛ=/ҤҠrAQVeB8<:D4TKHg]h 'tsx5Z`pQAsgT:Q1,%!UF(>18߶[O(<RVX\JhwJ(0HW25p}rq]q*UΫ^vMryXx"~tS6G=&acaڣ,-EFRt/`EZC mi@KܝJ4iZ(t`y{*-t꽧+ѺkqCWRSDWiUK['bOL(: +~1]*V+ JhUBIS /Jz`~p;Zاo\)uնM/;Gw0'%tU¥tU{NW%ʵe&{AWt%tЦ'?Er*ֺ5tp9m}]@H),ZDW nwp5RmwPruHW c%x kZCW .nM0RtP2vHWsBT*~*U-th)NW 徭:ѕ@kYeLQ4{63#o_b"gsw *x6X|4n<ג)E 0wjdhm:esBӥ!ך s#FN=&H]S]YfЋ.[78Ŝ9s˭:F jgBb27,¨KXwꈸ(+$"1%PTjCĹ@XND1*CeZj1U{oeX:aǘH6ݾkeD0EOpqkfZ}@$\[~zy9"> "UJhJ.tJ&q( Jh%wJ(? +MEt⭡=y_̓PCWj˦WD4?vu?JbKWBP6P=R+u7=r̷<5;)hU}ΒZQvujGw2/|U3X!(NÙgh99x xN\;Km6a%s!nn - xRˑ0O_~U)qՂvo=ݢؠJZ1swG0P:p%3$*% zMXқ(ty v rnBȕ*LO?eWE (lvTI{u }8pn>ݛ甋xd8±")GD ( & J)јot |MB:zeZu§e?`)ecRYVfi. !ʓH&7-p/ =P3k>zncFq!*꨹F%EԠ'}@yl#z+5z+wPgQ&O5mcHsQ[01Ej5h$=[yb\~0f tZ߿Ma4Q .tiaBC"ΨDajWbpMLA#E?U+x· ΙC ~%zYiktazrʱIy])QtV>gbλp!DK0C?EYTjvmp$GGǠ|˪+&I .늞doJRBnB3:K{Cm!I0y<-,f!,Fwp6.9Key Љ˗G'?=~m?;{ uv㳓wj`t 4;Cm'_?n/ZDZ4 [֢RW-ಭ1}E T_ E0w0 JOGMpx< B8R^Õz3_`;o\kCpj7(r& fKnʬ̮^2ζwGW}{Hq#I! 
@18@hde q'\y=)b;!(xR8puzqsI_cM P0;/qO[rwwC6?~RyÐBH3 axu$ąG<#S

423 dG-ܦghL'3-GO_k߫sv)-s~ڄGʻ޵6rcٿ",c`1dwNog/h`7*oʍۦaO߯.}5l6@l?!jfl- iUog龈,5T*m3ū\gItB=;zvkpyg4:4Xt"]}zĭN)FIE&,}Y)ԾrĞH}P[3G:`4,qQDԙE1 f &^s14d %]T*wn%H#)8Cf"s!d+3jyBl.DMb:|hnv?3[mOh.gg21YGqkB,j0O].98x8`=&tLf #.:=T)ox8\:|rȲBhrT2Tjo[y֟,Z"]{_G;˾bh;>E_W y%+YvP2|Pxe*r!Fjj@YLɑ~\Hۅ\ȃ Eb)5P+DLJhg*1 gc*9ܹ48N>r|䬁Ej&`fXB#seZ9X|QN~doҏi~n)RN8$+B+Mc͕mOc+(n: U)(Zis&Zi%V viJ+%Ӯ}i׸Tv-}lT:Q)/,y4a2ki@L9Gj,^$`́9#AXr>c ::#9]`.bYf1V1{;&}Dt^n^dϤ@2b7Ʒ޿y!2ɟ.1=alW<$a4k>5L ei| Pcݝpv:hWr܍_p:֎3Nz'pIu#w)}>u9N.YΕR, nTm4(xHhpuƃ+ 7;koOizwxr51Z=w-$fz.4r׽N,Kɰ.FӺUZ=dkmݠZw۞irn]1]-7N4|5q!$T(utJڠJB.a @HG'z)dK]7UpbnP:&~K d:-Njٲ/ Mi$ckYr--UV1D.Z+&mN,d$F33="< "=0@pB:1BT'leN&u2:<]|sG@ek|לs,̩TdKۣܽ\Tԝ YsK#jIs.ι7߲f_~U)[dHENGˆ2F>]؍$xw,);2<7ǬUa3#ߪA<N+% &T-DE@g]oGW}`=RwW{$%O[kYRH^>D=(Q$1` gjzzDɨ2H9cq7Jr`,oQl=oxnq=l("5}jAY"8}7fٲrCAKCN! I*aR_弙*g%75ynyg9,}uެ%^!$k+]xqK6)t낏-%,Tj RAȦi@%Iw >_9;8m> A -] 3ق]؆W?*d?9S}Mq:}0>_Kqz8?f\ Zj>7Q vNzT'Z3*eFSVJ,U9k rf`x ^L+BVھfփ-IՅ|> z/>wly,/0Pə:8b?m vW_Ϭfk_ꁎ#-$N8R_! BZRw|}r"+ϝ0? Ro\x8NpSܸ5?=,?iR,p6o lic@;a0l.NYncKbq^e@TS jM0:T-d;D,N]Nvv,n%Z*J[t"brW|% k2Db |;o5)z2Ӛ4V{}xC'h DJ$RF]B@DX=&\6.iYڃP8|c5LP ϨS}֞R[]we*\H8i]$̶aZ 'E6.\1]ؘ>)|[l;غ}gz5ZX͉t* aXVeG}Gƙu% FbmD/to 6g-1 Bv veUƺ}4ƂD}3rǟU شZ92پpu%ܞܕZ_ ͒unO={xB(.(T2&**H Yhᴰ̄,BzQe%3ݹ vIp/gGX*@aP b!3AGh;P&#P85P3/Owz?im[Ie-L:Tr}5jB(%@S&6DRæAhtR9n)%xJ޹C))OI4#sot8#[Au6IBTFQ(yFQX`c"yR`(g焖3:ǡdBr,4ں+lޤLSVp}!Yuևݵݢrej}:5%bc\HKj8L*>6: 6#㋊ ")0Z#m3>{<зm>p HA;N_J:oxO`3X]r1bs98-)YvChʪA!)3)rOسؚ㠞<hK,= Gy4ʲlAaQ{pN(0$G^٢@p RXI)+\xE;7[炽ot6w"WU!hm>k;V`pۼȭ޶^zق-J}Lo&?fӎyڌE-Si4=Rb1:?S8ooQ38KDM͛μSi/,45AbUͮT_ZX:}}Y>\caLU:"a,ӥ(*i49 %8kMĵb퀁d͍㻂&6=ԃfh!ڛq\؛qZM>RiE?#pm3\\V8j)c5কrZ˳7ǃsOjvp c? 3_ S&ۣYWM3ؽQLj_tvF{}5]4}@9ifBᄎbQW\eE]UjuuUثg,nT]U]Urio`nz/x}uU ~iD{\x{*NbjA|vERQ"5=RW`gF]Q;b_Sk;J;_%_= m~I[6zhF"u4*ݎ zu7BokWmꤜhpᇣʖ|%}y=<M~=f؄Dlօ1 z'$(ݚ}Gpȷ[ύ"-aպ댽C'߆n0O?|Ws5]:6#o^,͗. 
PȒ2LE&VtQD6i"%䋍6Bmuc 7)q{)YkOg"58^n^2VoNx_|/=Jϔuz!Ϩ*f`Cr\qP3Rs)*gR:aE($AuQb/Owvs\ya B2߶ `UQ|΋v )$>~=Uw[1y)$|T9˻mH|*ruiU:9LxqRgv=XJ(/lS.K .j [nwݵJNhy= pdlE'XgL,z6eeD$L>wa&Y ^ˍ++6:O[H "N6k RI9*-pm2h5̢HʀY_[پ TQ2P"Z,F˺@u$gkW,gAtFuФ3Ty}mc&;ڛwcauA`&P\ v<`А8T RZŞ=.i!PBhY<|:[MQ "MV39)&ku/H *@ "TFq(' zD&{@!FiDbX?QiLJkUk[G>毎?va(>6p/<}k>`}+ІF78G6ǃ_~y֟%~`RZ:@g#4MJjZ`Gu3cWC q2h߇OKV)5c#UBbI&hu^r_qMnv_Oj^Wgy5[}iy>dGoH{F^N\CG/?jjʣa\$2ػFndW/k}N',6 8g"Ȓ#s[?ݶ,[.,6-YWBZ'.iHm'ݛ^"v+?~^οxN D0WSwٵ͑p4ʹͬ&rhigضfI{֍ p+,'2e =YٿOx8jk=!YnM$FW@5{YӨE%niTӜ}X8܋K"0͏?T巟/~ .śh4Œpgg ]^ko5 O=]e[C ߻cJ}gU |a?'24mwbfE/{ Mj~F&Zz{4+ޖGk" Ƕm?cX?hLFMFqhe"LAFCǵ(EbۨBQk  SWٰ՞/p*QdMyì%-Mt ڄj1%+!E<;dEiURȽtLwkF3T 쑻 wW>iq޳sNvm^mۤU]'f1{gQ6뺋R5ʴ*Imd &h \X\sr&&9:K(`cQgzxL)KY'zډ͓d<\U?znlW#g_]Ϳ_:q?NZd?^%vݹ8vDH0^]xIC3 Z[ߣOWa=Fqeoy#9bFi C]홡Gp39Is䮖Tg<(QEq,*iY0-hI?r{3dP;i\Pk 6*Kn4ЙK-2>3W9V;cs=}(đVw}lABMGK^yf}_ԗv rMdxƸarI 0+4]U58:NsDɩw9Am6'ύBIHJ_Y[Y_GivDb_g˾dh[ -6Ek_>`Wt,K0շ,[(nf2ϯw}IG] C.H]C&H)82$C~ 6!<7r; ,MH,LTD@ 8B@SZYAd1fmYkfLJ@]!6ZE,C(kAB2uQzICkPCL@Ç.{-cZ{[)k6qߧY9+~ql?*5hl;_%M3;Y'1_=_URiu.9vR2y +^7.D35s&%zy^@Ko" I "@lJrGnedAXdMs"9KD.D-p$7ɖZ֘NUB64Ʉ֬2]rUUATGgIDOX4Z[׀$!% lue&SK=q2sLiXP CT.;͐,x3HHBoe':ҜyLTמKIn:A`%L62uGߟ6d7hb&ȹ}%a[+==m̝hcN3& *R6])ZRT%&7XtV+'M7֐~|#T8MHD Xm#&L 7=miJy"B@&C1!gEDA8_ZN% zfux)Wȏ"`j]^zAnQ~{ۉl۫߇->lE<@vOpS`0KV8{o; uwiB"lxl&YhLO+, Ȝ `` 5>$Ckf%?;epv)ag&ZEmRѲC1xPhDPIοg74~0}Ȉ13"!d?ɤlbfϘ!3|Lmy5-̤B4 ( @mAK4"ێ$ t88;fb jW{6guQ{^VǩdEIg UCFLVlԊcRIYi%Y $+:B25X" #a}Yd| ٍQZ)x.Xm~싈2";D|0n98xi")y>Xuk,;>'$Gnܲǐd'% 4R3Ȥ,i! 
xe!NUMG>^&zՁpq̕:͒}qQt1q1Ηk`ZY.hZҬ'IkP9:$/,2dpx.xXmv싇Pa?<<-S$mry2MWs7EZxG8uޏ'~U8NfuJ" Rs.`W,d`Q&'LF4kߠԹ:vԻgZ@=% sAz »`dAfUDE{փ YV*\$ZJ 之Td; WtȦsRoYʈL5.:Бolڳ#A YՏyi^'$bAE+xYr ?6tˉ^ \pPu3,ӉyrvQ:'~ @'4.tT[[-R$WFj1iӁihE6JI&'Qb2z(X"\Q($.d!ȔH 9YSIѺP Mk> :7[>jS^ 1x[Uw[kxyv-fqX% jIiXD%=&: OQ&YRD57*OAV竣z ȼ`0W, h4L(P|R΂!WL}8ȸG?GG?l?UR so}WE_K{?C;Z'?eHo7!44ͷmz>N7qIY\W=ǞRfzi3wkCV^ʹ;7~G ?$yۧe58[or5mUij)h>U<5\wx%I͕,U'V_z8j%j2Փ6|OJb f__hu~1l/V5>ј4Ahƞ++qaqsۛ_dڒözˏ.Lvd`ྑH祀UdC\tGϧzuO)ˏwmGlSK}JĊ.KG0+ rT%ڝyg'dЯ~o۽Ssχ㫋wȭwՕ;.Ctr#~ nf+M?}u&HRoz/m*\IKOn{OwXǛ~\.)K:^ mF =Xw\>fpQᆴj0<$t]p, +M7;wvZ?L?L \qgq^aׯV揮/JT^gV$^"lBaeuh18 66f0ufWZ|O3/~7əд@oZ9E^?+k4 {؄~~t,L|\Sg G?L*oxB)UoLͿwm^ LLJ&\BwfɆ{.y7m`4olEjJѴU|fPav3h3ijG)7xPmX޺m~!"S/FlrbsQ&(#hQ8.(RLpEPX &)NfZ'ǵtrjǼu {81H{k'3F )/#/Bb%PJ RwV7yvL^Di2z8R?/`~Hna?q5o =m~R$L$@$KOր-%;Ʃ%qf`9W 6* &3eaIέЦ2\߶<,_ 1 7+yfuYUW?עesh;=[ތ^/-IzG$H0>8/@OÛ|5xh@G+LԃEE4 q04fN(EKf-D116,9 ' ]ETͺQQ%$ZJ!bJ:m EͣVL4T`^(:Dm5[<zY׾ҝ/̽bg h;6oXEo(RSx1 Ԥ9 w~0soYԔz]*T:f0'ӧGEk.>e'YGd9%&jH"јrQ@v;͍tLx4LGN*CX1c +%豉hj4BZ":F6x #k^  J%2HBK/cZ=RZEQbwl֊kJ皍/ѓ.^"XKs5 *p3_Ln6@ z㷡E>]A3x V2iH U%ā" "{I=g3v8 xb7^d`'Jhޭtvs^PzRnp[3? ?|y [1zi>G B0|(9O{ 4yL;g1z6(-GF>~Pc!?{\o=wY[޺;ӹwVWIҕ#šˑSXokS&.nnگ6vw}}f;qəe4)&&C9A(!bR\믃u8CeȒ2 J)/)5sJ(I2~\ǝ#rq;}$K8#a#Sb V1O# h" O&ȃ!^@JtD_~G˃I ˕_nӺcjM c#!Hjcʃ raYf_g:Ɵ&z4`S^r{丐˄A5ICݜ.O3u%}qO:^5^Qg7&rD:켎@G# JȼdZn8F (-3=:.,8I_h6F9(B3,X`ʝrA sd21%T8֩ EXv0Q0 Rju$rŐ*"Yglig]̬-w&j/`K[q&sε}]gzaрc%s$HxJpp< #4EG> 'a@*z"0IDTCe HS :Zn?"g:Io]U`Ib(hp2@FHZx@(Fp3D਌+qt8*kg:SULUY~bo3LV0ob!&lV60%kP$$}$4#fek.#=Q(|0tX39koaF̢^iZ1!yIʃ<4㔚 m6cf{::Ӑ]Cul)&A؊n'hRVMf{vlt H(킄@M5g`Mnܶk.hI \ \5%j_Ѐܓ$E~tKR-IyIj"S<)qύXNE DS dV<1Ec)Xr-0L:7TRbMʃK,4x)#TƈbM5uQ8o[[A:Rb"J;f2ˆk 0zI]t]!1Yx%(T|ԟcԅuYb=4߂>PaRRl4vCmYVcaH. åRD/*!dG(o4vut\RlWbOGc1Jl1,cS8$p(SJkq: 16vD66.wmNC[)wG2()` g{nr1#J`ya9VNǣ~ )2@JF/tJÃ&\;qֺi89]^P0vY4a#A29HysQ b!pO[zx-S0!CE T=_Xq Bϗ% :=K|B \q%>wGﮒY쮞Xpg]3SSTu{11oghڙA "D{5Ĭ( +U&lY0/3/_l~@/ {z.M{}{ޤ5S+"&m*&zI.,8Gt۳3j:RjKf(L3M0&YT}zv_9N! 
։0QXC=a4r1AӝIa u4T&X]t`<e,!]Zn5t:'1beP{` B8BsqrX`7WYeW h,,`◜^"Z WϿ*iC/A)kxqI%Hs ϯA:BXI-SI-$i)9BZxB,{* ɸ+!q**IK$b쮞Rj~JJJ89wTUVcwWIJ癟R #qJ*RD3Ϝ'JR`I+(9>wˎrWIZ} H)ewuJ_"MקvH<@xT{] Ȍ'I-d%Y\ټh'޴$rժp{= !eI%cΗ:"^2#mofxY<֙8/ò0`pV,~v<ց+tĞ!5V9!h7Q:hecxu-Z=7_(߳T&թj󱙼n8}mܻc~-1 MfGzxEo[MŭЋu:m=ۖ}T6Cg^oA[\})J!-%9&Xm3,إSJL-E3h㘩E3hԢr3hL-E3hԢZ4SfjL-E3hԢZ4[[ԢZ4SfjL-EKj5= o EratԢZ4SfjL- 2hԢZ4fjL-E3hL-Eq:L-E3hԢZ4Sfj TBi R 7چz!cǁ*颫N_p IUp;oB!*!@1gz NVFq֞8k=ĕhjt^y3/ k>"bP@POA* 4iۤ4}.a\y sZ޵߂a~8/kKLReKDcc=]3WL{ uVgG=zfi=.-d\9'UFg2%Vfkʽ[r+YrT:ҔΌs.ERzY"Lh1ŝkǴ3ؑ3r_FޗBs،߬δNiī ̫fu<&gOuow4mՆ`?{T#8{E9v ƼDZrjWhBAas>W]R͗*| T2zt,Va*yѓΰ=E֞8̖[RN^tXg%g.[<{ܓf.c T vZFRR23"ޠM rޢ YsDYUI1`팜o 6'&,Eˍr9jfㄛV')%/2ƜQcqǘzڮR!.7:5pT|Y&n!4qy5C-qXPt܎ҹ-pa*i}^_igĒJV:&Q Xe7 yOD}-_V|Fۿa0[[KAA{j2! T#qj=V+I}NEi=M$"ZpkC myjd<'c8&O(,-{Zꋒ#l$_\8,zL+#!aZsדmK+!-7YI-OR>ok'Ѱpv[SZ_N2x}wbߴdLE+]["kǮWuI5v%r#iKb:0r GC6ҿ>H+j]Ͽ֛Mth=լ;fun7-w>3=Ja|y9I ʲGttU͟:~ek[s>y]xwKmmRFs?d^̤C"X/s1Ѐ[Z ΢ɪs7J!"T>FK\gUV*!K+h)F48mcphVlq{ҽ-N̥tuș/*3KHicK~X>qmfM&$7A0-d)s4>&yԖ 22,9".q.9;F ^Wt~j U{XkqNJݞ"e@ 2LIs/0YHKǿ7%8v/N\N"F%Hk$1gJ:F^i>@j).fnǮ/]ؾiE%N "h( %Y`pj;*<⾕ltF>Д/^dIf\;絷JUj9pJE1H11\Y:B_d^2\'5QNc%IK˲̽.N3nTtLK䞑ewA DƬ@*4<8m JN nIB X%$w;nIg  /urB7R{*7J~L/3/>59n5&gLX)~e7_~pF4YrjͥD岎?8?8{9gR|$尣+8:դ~!Gn2Ih2%xqm5{6j$'h_`=X_ vpj1R_?Л7Ӌ_U\I)4j,7BGaF4{+ kOtN'K5̌ apgFփy1я͓og_[U# 1pχgr\`t9zZ'yGoiG6֑\?E0uuխX6e-rt~;_rparQlu>ɶQ檬Q'żR}z9I\Ax_:MDmK?Ά h;OOoO|'ɻ?;y[:q@+0Mdtj!}4 x?C/ivڪ.g=[mjXLe܇cp>5Ł[3\y;NGhYE*zq"NaEM_﯆lEEad!Gڵ%⸏/6⣭h_*N$1E(v\4(}CBaP= Hml枯hpxnw{1!:.*eQE63R2HS`IFtNgWRx=q(e6rҋj6wժmev$Zş> _T\tELB:c*!@V>Wy%d)cQ9seTϝS=[FNCr1,hcz,@P\ƍ AeB>:RRB^G͂S=Ɋʼ09y(#g'5m7 {9tb՚k=Csƛr[zέcvAH};tB,{S܁zZbIYegZҌB4 E3^=S#wgI_5GŠ pg= +NZ(FJ(ɜy6>;DžGPܫ󌈇Ӯ']3+V议GG5mv ^*3K ET1$sVa3n4<{rp(SJQYL28lȲ Pk$UGLET93rsZ=;O+].ZS9mJS4qE,_Y'UEqxȌow\E2˹&؊:&o HDaCH!Z'{=iI9Ijn>F*%>ZBV+J֡SflDc!iweI#nN!p xX팜.|@*'鲷ie>n{Q^*3Ǡ%m*8h 2y^G^*~@R !1B]D<3?-8M90:1 ZH>/ɔ aϖYdJ4VSH>L} 3˧˥^7Ι$ypbXg9Q7ۖKgb4:5ut]3rAL9,C̉-0R"}N1^j4r*}*$u {fy f&\OLs!BbN9L9%Ԩ *yֆ Xq<6Y iQj֫ݨ^fE.xxx֮6ȡų~'QKP'`^Oozmy)qNG0Y&$Ɍ4D˺=^Q8Ϊʺ"d] 
{pp~]{|S_(pz6l;v#->4w2@iC/j? w)}'rʹr=KW籖> JhpsF)+ 1Уik)Hol{ͻfbQ-߆B%ri~Emj~Gl;z0oW&n|ܭoP_+fmQV;ö״^HZ>Ensz||\lNyw^Ѱj\`'4rӁ%xYxP mgUDr^Nex)y\<8xJqĀ;Q}0P|H] ҆Qb, :R.EtJ.%%33P6rIB$5yw$bHSڳ]|:#gG"B-ſD: k|k眶9hGA&MBD}r $AjPJY6mY)ͅ8 @p^`br,`Hh%9d_L:jWlgt:mZƕ'X;]Ogd/Z=|l4օeZH{}'u}ٳ-=K|k\<|eDha`ā5O<9N8(c'1uK4%V%II2,ԑ57Qp&I,(NSP3QF E=Τ)W*:߶#ckp`7|l:W:[Cy8S>020-B_KR&)1DhxB O/‡Iǡ|ZCvց^,e5F\ 6x(s#v ֏d(dQŒЂc/U%-Hq0uNN1✐ і\;j!(aq>D7YNBe,G46p\8Ir.<%O9Զ/rkpv-J̪=gV/S5C'~@Z'aA|&S3?lqRn\Wm.lR5F,VƹID.Q7('l)NΫsRrT+Ƚb$gkΔ9 X԰)I.рW4zIY L*x8ڻ9_Ut(*gFա8G~tΤ}٪ OU@(.5( O;" xMcҞ 'LHVL B !r`0BFb_{D$S&Iʞ(U$FDp^G8qanL n`w :;0iB nAw&r[9LSNC%3  JE$ K !W@<tly:GKDŽ ub™FHsC0u6j_,ᴈ͓,cA<3s'AYN2/$$JXﭣRTCDQiUgZl?P2 ^JQ3 dKR$;pWr1+q\GDe4 'JȫRij 6)3J"!Y=4~Ckt;޾\q].lɍJ~+,v+Vdgޫ}c f,6,JwD?l?H?f餲SiӪձeG/r z e=vuΙIPe;/ϊ~N*MC9c`݆Y\4VڦS;^M2[TU9t,l@\'M#lsۆPZ;; 2,p< /uO'ן$,i }ah^-b1ZV\t6B:T_TWn&Y6[rq\S q1k2km1eM~ULtMݻm#f;W,ju:׏+)޳YFJ|jKZk=u^KHq'slKۏƐ}z:~m&5o`&Ԧ~}H?tiJ')7_ll1XV]\}:RBPriwylKg" NjLb]B3d{5SύԻߎ&=x%] l>ذ&!5VN[*I[wxٮ_i:oƬ,j1u(<{2ZMݛ W-;.lL36V2CWj6O4axW1)g\2NQ@!li-Y>!T{5'5 FePÊK+ nYB-mhZpN렜 ct(ATH!,>uPd:vzD)F߱?xF?gaDgt~>WڋZ]foSjVF~F6$_W\%|r:~` b7S>jOqhYq(F~Q!!K})jȾg4Y e:*Y"bBwWmޞO5aŷ?|_+WK䭛jXf]}@i,W#hԁ?M :)u5|E_ԬP 5kh,aҦ$Y,%2 ʁ±Ψ?>g尉ow3^w}ׇA)Ov}?wUMELBІc8JFB+}B2Qq(7L3#S3\ 3/3І$=K׍BQShYqF Uh3"zˊKFHW\2JÊ JYoe&`݈rY ړ z.!xAę`gA߾, xQl%3@RpBAB<}x&Z0Pp Mgt3J>AV = XAo*5/tUF@W'HW0Vw0' ?fuUF:E2Dh=+l? 
?Vv^(5ꫡ+x`+B4gϿv`A3~p3~h5 $Ӯ`mz`Z0p ]et(@W'HWV}Ү7thWDj8]e:Abv,]!` /t*u(@WHWQ#ʀ ]!\Jd_*e׮2J>)ҕlŲnh\Òً7v_&Vfd[Ξ/ܓWad'ro|ER%r9ӋGK\q~L  Jb: .4Z8Γ”RUagv4/ŇlUn|"J٪=zo~a+䌄(K5"ʭxT#3 C#0ͺE\ eJ)p]P9u*G2Zq@$|HJHKK &ҏ_Z(ٰ|K #t*67t% Y h):]eLtutUFٸpUo:]etظ9ELizDW ]e7ADKUF:E2ѧ+ ?.].}Vu~*܈2U>+©j/L,;TGHE +}@l$äL4@ĤH@ ۜ0A)ű&@9<e+nEˍB5mU1QB&i01l%0yF%`)MfpR9*Ūd?JMRGTIxID 9ԤDT܈FB1'Q  B;36cTJ3Ixin4P=$jHaD 37_N [ A@3(!(,D)V)CiPKM C?; ({`6U&)˕2K 6?%:,Z0HB$b;j C ?/C\E&ZizKPHơ;dI㬐9n *MO\(y8onWyP$% ۘ|p.#hҸAO \-]+fĥ"ҌY7ICQŘ@QE!i$% !pB6}?*8`ā\ެzi!uܩ]-^6u mz`-$t>%FK< *}t4+&{H\!iZTUFB1L: !'q֣2E ) >@E&rZ5ȼ`|ڄLktqiy{'*APP#H܎m }áEU'US Y_"bΊf lZA]֊ANEUж nߚTwy4SU)VXB(;KSv9(P"Q _ !}CGP=n;H H j`#`m-ѧ@hg]+y r.!z-bB bjw !T3ڄ`ctGJ߳yp (tg@ @$B2Б5;9v"(Τ$1S(ͮɀ!JP CEdU5\J~,*&a!dE(3A68FtblSNX)j^ ?h}I[H$Q5$YI$e(m@Vӥ[oU4^"`!-AmU($`F}+M-)K `n ڪ6<֣s=m-.X*]۴Uq$Yچj0hfU:9 6IҢGJ$Pp$JHu\mhT(RRP'ՓFCkׄ bL _&z7l}IeFI*vàDy آsE6G=TʍڪP[^O:(Jj]d*2(eFAjЁF-HOOdPJ;HUߴYaKl+TOۉ%6b]MA57Z`' +U@ &eCE5FHmr3u#zށ U @ +IUE :ƀ6mӺ`faƕ݁&zPIQ{/AFL(X*-lF]Ρ~]뜼oY%sP׮BT4|ƛւ LW YU>ڢ  '` aвM̀| =peRt_HlfhJ7a#U2'IkOV(T %׆&I@ b~ q!X7i̦rf4DX%d,Nj+) RnAKݿre}ᢶٮ!tts/_83?y!9.oH3/W{Ӯ/H/~",<"µnroqsWf [II-}P2?J乑9#=||׉8H?z'@YN v@b'; N v@b'; N v@b'; N v@b'; N v@b'; N v@b':,q98Vfv(؏@b'; N v@b'; N v@b'; N v@b'; N v@b'; N v@b'; N 眜@QX\kgZo )!@8b'; N v@b'; N v@b'; N v@b'; N v@b'; N v@b'; N vd@9͌@>' a.N EPjNSt)Z-N v@b'; N v@b'; N v@b'; N v@b'; N v@b'; N v@b'8_Zݟ:W?-TGǦtuZ#a(;.ЕfԮ:# n.tskd:IRF/[fCWٹ (LW'HW +a椮XJ3"j6~\p")-{fCW׆3m+ "0] ]!>%n_U61zIw檞ws78 ],x7Oolj}QV2?[:?" 
(`z|dכ3ԏ00 a(ѕ9 էv 5\ؖƣ+6tEJy]`gCWlڣ+B" =[b6t9j.tEh:v"{(d:2AKfDW :v"2΅NW2]"]Y<".=\3)mWpy=9;Kx͇ siB돾(xizQ/POEWtZ돝e;HW>H-CWE ]Z=5UN9+r> 8s+BkLW'HW1Fe ]\5";]JsW_]ُz{F,ǧktu\Ca(Ñ7ЕeԮW)-fDWE ]\BWC"1<8 _-5]\3"{+LWHWF*kT gCWlЪptE(d:A:fDW샘 ]8+ T$Imz?)Il.FyNb6zo7-:|h]v@;L )|6 :;yFˑ{ȷ; (gxLT;H*L"KuÙ{#|upN?h[X_r ɟ7+]wET :l+K ;./N?_xzjpDo\*l~( dmBFRS=vnR27*7ǯn./޽nBџw:?^ߴkT^~Y7c465L/~ptyN(' 4ב`ς }f?-~7߂as?v}iŰ<Z[ 5ĢtVZWQdNRErmBhK6ȋW&%ڼkgZ7ϖƜe2)&zFR]"i׳ Ms 9  f>[Mr^),]0{Wp((_-{] ݽ.C-@'l^wQC)iћ+dz<2D[o7w+:}Zܝ珺jwi\4(r T3}M)F|ɄK~7EޤgDw'TV֪6݄ 4UU۬MoF5/+,n=bg>6rK#v\Efb193D AqPaD׉\@ɥ&N̳R DȝADVCkc)CΘ@"ƒ:Α|o_5:zPk1, DD*Ԟ;2Af!܂IMbWkᙸ@,pBP!ߘM2n4 |p9$R6t$IK *d..P3g3"b'~{?nFٟjA\q㮸0W|m[mxw}sA',e<-&DIdr&1# {KH}).1vGfg" o$mxFMcL0*ffWo,(M "$) gn#Bc EOυ:#sa>`#Ã*GcB2b&*c9/|LVGU#d)PZJ6DP.{XGs(TpswӢXà5&բ IoU۱kh~wm]?*'4Iz j?ǽF1M&{0Za yp3}5ÿZ0j fWG|)}#J/.{#;:S*.H?X;Φp=Nq38Q3޼ԣ5Ok3lg) 6).aah8Ir)5$4J9x$ɪ !U6A'BRܙ Ig/!)Q-NsDk})wJ3Lg,hp%[ME Ņ2$O?2+ i!32ѕV30lxLwS;awS I!`jbwCۤ5A is~'J5}X'&&nye9h-ċ,Ie)D.FEt'-fڦj8yoS`ɰ D{ PY9] "$d=+k ?)6JmCzgɒbIkB.zi\NX\K,`bN8 ['n}}k'UCU9oqFcReT@HsʠRngFǥ ,dQ3 =N.-x巵`G_*=Π;܁Fڮ܁8 l1'۝: _' '^vG:tg۳X txZ](9\~i<zy׾Chyx)9U좟n.ȃ9#S!9`{e@/dK2E3 t6V׆vZ6A ]9iopxx S#=puoF~| !M9S=nr-c}pv'Qc6acd˖')@O1Np6o oG1.gxPQ+zT ֿO4yԇ]\[80C6@(UzTo nBkאtT`S֗:җ<e58R51Co($x:gUɑFKgeNLJq_*jx7jI[yګY?JasAzeDyGh {+/()bBZ$.84J%)7Hae'ƳI MV{T9Y)B|3Ȃ3t#lShg4#m_")RFB# :+a$D/gR`NӸ?5Rj%xL,klYqU6J(J29ι},X&8x*&@bEm}Niī)Qxz* +w) p{':oKXoR{#KM9ew/'U]SPh+K吠r' N\`K:-mIc XR *ó4R\QrG+O-QۨL M-c=Y "I.C>J!6s9xt0%YZ=qZ|r @/ J3'wfz~z"Wվ)XX . ~kN&n!<HGô4I\)zП~QN 78k(TR)p+PUzQ9/]ŸDL')oY=0bYzkEPMMR>Nzٽ=?ϼ} 7~6w$C4FU`#oP |$0Q4b9 ^ jt_Wx꼽4h&Xm=v6Yfm{iMB<>|)_ԔXz6ylZbAa*KGtwr8$wwYGvK̨q m:"mTc_̚]^?8bz@b.[M+j̿wOYOtjs8l:z|׍wu9獖a2oywz< ㎲:^ToKvƬYr˦-ۼ)}䉣}[Hs\dA*a2`W .2}fб/y}|Ɠ3ڌ]I}eآ|);{Q XҊ3׈VKcq..iۃ2G21*KSR3KPiT,qp4zJ sR Wt Fi4O~ "yVyXYڱbe%9&+" ,i90!dLH31em5:dE/⍇s}xF$ܳ'xenh_ɸ4Iɾzlzo8{P9^#`u+ j \_]5wiE;^) @(&0aQRcxt=#}E!0./̢VVۘ3)l9(9X2,_+g*>dpħn&{-Azzf(b7DSjIF$Nzc\`!!Αl˜0&'spZ"5u< w:zWѓA]. 
[binary data omitted: gzip-compressed log archive (kubelet.log.gz); contents are not recoverable as text]
|VAYI=]Y 6dSS{m՘A^}81=V-ܝT7H3^216f bʚiF6ʘ %4LLGw=u-W%sۊ1CK)hӈ1&ten:TO|'؀-Wun[5 &tn+Y;24y(dJ9WV^/V o* @=cϫ9XjLc>+ƓIV4Z^+,|T.ج(ƽ}IKf~UXzz]*ýdXSt㧏vV,QĵW^⭀|b~ }~Gٻ"!{|es<$/ ,< .x3-^\dD"G[=h+܁U:i>۪a0rBh_q|4q(ه|wꡩ/] qNEeVv !{.m7p6ߟ)M{zJ۸TyNv +ҝEs=e 7ڋdTНѶ&(3gc"glFz‹@8';c-!ʟ/˻YG ;A ~W2IWlv[/6[MZv'L,zUEV~r|hwS^p㌇mbEjuBSѲ=cV1V\yϹ9}fFZ3F]NT\` Z50q܂H|%8 Tv+,ҵ$!}!LC!2 >pQ*En`GJf֘'L@i bLvۈ}ZE] TA&cE3JTLc{'b;ґb9Dkvܳ#ܞ N2]fR۶<";FbH#; "\ k'd%z(R`$1!qraN3ddfrCfS <% 34d#zwaHo]3畞I//Xcar$JB dKU1j'M ޗcH*|6PcQ9X|+~/> Yejl,mrBBL'"-$$T*I(qȋDPS3d@$+C,OMzl*8(DQĪd7 a ]A+(Ʀ \Z(ä0bz#^J-9{iNQ̀f3g)& .c8FuEdT "$){`-wwIwQ0"g)HtV OR\ kT'Ixd !Ȁ r%k+/)kw^势QN_;[ vT؛Av+xK&n^r3MnSv[I ]( ]nj_~y! #0n'ץ*I!/޾.3pEV2 oN@6ڔݾvSv'9HjHCVu飸%$S \Sy 1JsUdȈMA\a)ڶ[ c)QGC4CVgCLX%#er3fUNcX nIgn Wbxng $ٞW4`J}ث& EUZDB0 TLT"IC02PpYHC54YՖI6&X߾4!;U)/`ʄ+acvKm wT]$&5Jf)iȶ  (^*ѮA):3r`eg L/v x~ 1tQ@4Wn{9Ac8`4t{~ c( i{ut#t&E܍qFNƶ m;,U}] I d2&Qfz׿Uff劵\׿X8 v#F?qLQpH(> E)B.S_ہ-T5&AI@(Xޫ vf0h0.}xQ7eZ*Jč~ /~1,rºWdP `*ۆR<{>2[GPXo]J8I*0*۽>9 /Q9Q+J{_~S%,bƻ7GҜ*~aa_zCIVmUpo'?s4Z~{#t - %|%p Tda!%@_ҝ,]ץ 5DQ`J3^&/L1&5 1d$gY& ebo8 )BRZ49.yBAIJ\@&I& 1 P&Kʲ:,xRT,|Y}qYd7q2ɭUp4^eYzdw,ۑؿN>^&hl1.{3b0MFAWobߎ.̈AY|d|3ƌ.Ȉo`-I>#v>zͪhE3z2||e;%n>L?ߔƋ82D]5A/K! 
}'p_j;M,\.\40c?u_Ik`^Q:)׬'8NOr?=cdotdzB/:gsr@Fċg')~xjg̾tOOG=i4M^&e=͎x(uf/T;l=4u&y& cxv;rE6:hڔOt醈}MN2]LoMYLx&7?y<;G!7~gq8,ܬ>pj|ݎٟ|5XqK<rFF2MQ`>5\}[(N.[zGьȰ 6v/kΜ'$9q 7drwp9Fzln<غAAj a@' 20 +XxDsy* X&òŪ߇d}c %“ĴebL:o~3 %ڇ9PP&tZԃfom]i-^#$C%7tGm'۟nd<W|#8LPYRTiYeB{A~iݵ 'ڜO_ $De kΒדiC*T< \m7X:-㴼8Nj+e]mtexpF8P+%5)qq7JySKm fctu%1@cwA1^V5;+1XU@Yaݦˢ,v1el[+gyS9M2\v*KƯyh\KrA1 ˳i>/?¥f$jIyT+CiqϏ6cq7kVusNFz(Y[t/*PΗGB޸-S8d*=Tv*?]_Zݱ`hXrEZ5~7cwCv_=8(O+vT^je^:2:F+'}}~5؎s_nT%W]^7ղ#ubX`<Uu~]JQ H6xns[N S+H0 D9C"LMך*&yrJR͑ۻI|@>2C|FYF89=qG72tI/Sb>fk2 Nܴ8&`Bbco Ɋ*_MIOj<^`>N(6nyPJmq»6P+ZTkkWk2?Z05m/¬6i%{`J6^)]fHCNվ h8h1~Y?-?1v?1^=Huu{lѮCE a[V9 Zo!J=K͏iWa 'Mxе֋Fh[-@0-@hBy&̋ћpWV}7bYޱ4|/~^ =Giϸ¦]`qya(Ku- 4Z 3b}8/#E 7E}#6A_YRp 27ĀHdmwC$ ]}<)5-x3}Cןc8C%땖pw AR| v-ł/wsŊ7hAY.PE\T$RdIVd(לs1~TIdp$zI)EB[0\;wi#N!6T*7T|O9BĜ-F`GuNM"k޽CFd4 PC7ېb-9gȴ倮@e :WMZaN-#W7Z>{\MP$l5hOn?^b9ƪep&n߿\?Jp㣷>>ʿ % kx__erS7kU?ކ.N4_N_ቔGdWU(<ђ>mULyqd:ݒvL^2#Ofb=jZZZƉޯqQ;tg+4\! G1TkD1 B8DŽjv W.VO@.1>M.+8ɟeZ8< b b6`\"^5o8 vܙU]G~{a.NM,6R8 vnܗ,{rGWݶWq^Eg ě7tVtVEUq*+WcSd:.Ln?8ۯ?S3'\cn7,$ك̂V(Pئ>p;]q׊Ct>AbNi.gW+@_ Aj³Ep8YSY" ߉ZX[ڕt?*i~܅[F⭫:FJ֑q?`Kc~΂6(Vҷś#D>{%:FK8/T>>oo/"(싥{a垧D4>eїȚC8'Y&BYo!:P1(q:BYH{ڸ{3 9:?!/95 uF- -v5 kmEGA IwlVv?9oi7yp*< \"NdžFO3ڄ!ffZa%a}$ʕ< ,6mN&D0jO Wߝ~Nk `bUPL0G#5ԬbgdbTH܍tMW5;8}zk2u|yI?ޓߣ8G*u?3>E0!% S.)m|ᯯ K1*eLwk/Lwi^^x>Yك^rtQ~2^dNg1f"ٻw_#fp ԼoJEiE,*?ʲ5)U+;,jܝG?6]oSLZat~2{;OC:zD%j%dARǰ2x/)Vq>D/Q0uLM o$X̢0~a\FJIǽ|[c6By|z|@s5ׄBA{*! G=JW4g:\{sfh袘y5;\\!zFWlOnE70C. 
b}̂'R`LNH -7clSZ6*uMH3w˭`Ԭ WfϗL`J0!"i PSh,;7}}~[=/qT08>r_spVS|Y wls*-xTbXbX"()!J !(Pp%MKYS]VGU<.dv _yv[F"v1Mz+Iu;xGEN~ջv$wnc s-ƒۼ9Qe_ԌCZVQĠ6zQuc%K4̰ԍNvcNL$߯ 1d^m]`Lļnq[sn&c `iuGBSǩpB uD%u ΄| 1=HvG*UR>.R`5j YJj=njdAulWQǾMwv6#pJ" 4G!L C&YP@<HMZPcF ]jbJ^ <a4!2/G)`q< "B6G0p̰ð7ې \+$2m9 _fp{Iԏ2Rz{uGDGrA%Qd~%c zJ kk7>zH# | }y_IW ψo|<2aKIެUTxֳgқn|9'QE_CچnQx%m}n_7i<gO-iT8/0k&fףVaUU-ޝ\CWOIk/18|!a +$,vN rc=T=@Uiry+.%..)L8ȟA(N#,'[X/m$V.x<]n.pQ};U~xqjb/{ȱKuTxX@;m"ocIp I,N.L~j<~TSʈmLSv|n!R1N1߁.F@ݡɳN̨tj?88NRqƝ[rNJVWutck7i>BehYBHa6 v{|{KSXSTי ^]X)W2TDz듚ݨՄh%дkB yNknVg$eO]Rts_+o =:WĦF5</ Oj]\jZ s0=YmձZ'DWٵLw6yxivxJ&²\( 0|}o5c*oX|=1:c(7d|W\ȻA zTN>r _E!*Sdy?|kÒ><|Kg^d|a_O@›/zC~Ď"{u:)5޽t%%|Kur|[Ľ ̱F X%GUm8|[!S+SF֩OfKɖ1}P!sY!zH<d]rvg*vxrc칚oabCD\qn mM?U KB|e9)?XvžA Eb@L̋kj ;-wKJqY߁Jij([=T,ĿkJY` &7E^3$o:~wBHSl}R=^hʇSxB%N{)+JH+@zhpĀ@t%Q÷*K՘0L`UɔcU sFɺ}>F?mV7*Ŏv:n)*oGwnkb[VX7Ɯ6FX*K$,[4)vL2l1םܳʛwLttOݙ<cՎN;w>N@D;mgœClK.7(F~ɶnaKj.1>^~e#nt%FYbf[&@MEqޓ`i-mbLbf9;-YS)mn,utl։τ|fQ8#9TZ.kt8,i;avł^2G,6c Ј#6`#6 xH 2C1(p$ GhK/2CL:$ ($0Re Y& Dq$(xHB. 6~14 /`8tTj$ sGo`:٣j#mFx2Nws1X@™Bڻ}np{x+7ȇ‡_B_^Wҿᾤ>!p3L~R 7kU?ކZ4_N_ GԧkCU(<ᒶ>h'm*QM ?8~2nI;j}^31ӿ Kubm-D׿~JZ{ mc+z یs*ݐM/PHV>{bg:c@ANG59tc5@omSE42-*48:$j8]ˑjN]٦r}&[@]^.yr'Į7{$Eyx^A DGm(Xb1A:oQR,6e8\yNQ.wٮ {,{Y0zk0]'QȀcnBj׷6sF ),xq. 8D0I%v=Ү ˫Cm?7p8T=܎C~q?g7p c|wolɫs)cNwn+Z O p{dzLJڡmB}v^X:Ih?=h91f=2QO0Qf)!z4Qrw}Zd!S̅mMi+$wӍߌCt>bQ\겵^6m<*0y a_}ɡi`(N٬-(H!iUp0/ fi4M(ђ%pܕU S'℘0R _}"H3d#qjUx@kPP_fL&ғ0jO Ծ;.a-={ĪF5<%axWӨ'm;U pPk\|?'.Lӭq7VK݁ #er^?&^fA,a*ʢ5 _z[g*EX|=1:c(7d|W\ȻAzTN>r _@Kދ_D-S7qWE$LhwCǤ {DPD %AwSqH- {Fy TJvTa"e鎑?m$K+?D{6}e%ȡ% *l'Y_u1à$꺺z $$n,?,# 8.Zc`Ƃ!plPHPRI"B hS C mYQR2 iQA6wYҌaCxbt;y':s"}mWqk`A\E'[K/_#Vf恳0NImvfzH? 
O,ovw6*2)rAiD2Ě W1vt-d/ \{ @?j·Pf3ǽAnmHaF4}BKǃ^ދ{Ypwʾ;ڀybӫy 8p|عw =1chn'ecu렶<_V9zi醃+[K iKؾ↓L߷AflEݍmSiYta,Է4Za vn@>frv xjNfS ,a|xu+ل`9kcF<-SXSU ;´΂nR}:Q&6Ӭt~38Hfgs0%I@ۙ x^T1lk;`_Du6CP,4L1ֲ?㖛32唾[K"&E[ D[q^zAQJe^"$YdM ^Ҍ~5(M-1V/MvÃmk?~ܤgd%-[Yd|X|9+6>p=\)Y~td;[ Ju(A$; PA{,xq;Fyb]J"!DqBAIlR)125 0Bm]?Ϻ;AƱw.Ts4ݸR]Ϙ x9^,5t:&J!['nZ]yXoT)" sãuzȮ`K27-)sS4(usZ+ynMaH,}01χ=/)Qi%JqLLjK^;2-T0L I-:KeuP mx:TلJKE&dɮNvM stN`\rId " -0*k4huqBCF1<!g u1 d[c|ݴ\2ken!+YyISmWw=53`X0waWz?t/_@LF9'>* g ?ȟ,裁y^gZa 6J q4Ey7GeȖ\<\W$MoVkxI~-UHz}+V8Ww{AA107~GȘ Kd}FUmXQno8CDHV8(+t>$;:ۗm|/w`< ^Pe"޸,h-T}O''KKkDp ~JXiJzÐ;R@i{{;E4.5(yC.ip!R R$"QC13)T4Iz^@@0}ۧby)f^ו`ЋlzQHX "v pjsi,A@:T4F4 cY0RL,IIi_ڳLZ^<&ƿ]{Ĥ(݁{.OR7Kf)c%4j`i9YbM(!bQɥLyܤ6+ Ff (Vw=0NF$ y[ g8nj$ G$ aLPl IL6!EbX *`TdNfʤ8mLmmT.TE ))_j#edlw3+ZLè@ b\1R6C#f\1%kke3:Ê+ Rl9PY9|L˩kAP61:os%{Jl,cH{'K<zv"cݡdn4z>֣ݷ|;lU _RŇEM-Gm;bKfؙbMG|܃];8ڿ@! h}8}0^s#9|Q E|&hy/eޱ+8h^h~fLNQg`lp|عw  hyF^7?u; .&pfﮛ>͖Lk5;4/NN7\ͽ,\ZL[rw8-PĖMjrFu LyIv &n/&Q侚T0s'T0ۃo}Ga狙lLt+g~:"śuK-;O^0I;,ޞ1k2Jxoңw,:d|>M4Pt |va-0, pO;Dgfp X]3Ͻ9wd>gx{|gwt{&v>ׂU9N|f_ëa$,oyjx/-*ލw?Ѿsx?8tGߎ'~G?$zөaW\NlhY:k`| zݜtw+}{8Tu4J |Q76NQsW{u80hmZx~_t3&4v< &*+mN )$(qwӫ-X,FM&=-0n~jZde %ݛhJwˢ+KLX\icm?/1K5릹>O_;n '!+];xUrhxu&w[uECIWS=n4BʪGwU$R'ca7#zmћH@_8n)#f}*/e7 X}=9noWV۩Z^zZ` *]Yz(nF'E ~*I~,C?*[D %z2*C }̨O"<+BxAZG{f?j9sGs󾵘h.ۣ&9aeXx',L2^Y y*Sx}I6g+ՉvBAW=>Ӻu!\EctJ"{súQHX%Typܷf˛via9?;/<\_K}&e~3XGv˵ 獖[hV"BȈ4±jD;GÕ\lCܫJR2ۿt7X{[\YNeB)x*ԗBHؚ?쐰`vK?t3 vs"vrKNB&f$t$b_>>?PY , }HoVTd0?ΰFPLPF3 rI `N Rl~l\AI(\#Nn/փfފ&k+L&=57B#11*+5 (Z9ҀhPgu_xB ZN.&n)cx!Q)z('9dKe S#(bB,Ui 8SHbg[LAfJg 9P_ME13@禐 EwdeL 65. |ty0|"v~@R 8a0$"aӺvTaG8DW vԺ>ع\#Pz~!grDq®7aX{n0ڙsZu>#lɻlҽW:\JS#ty KԐ׸ES$6aaši)F-gCm&O`[s0"QXd$0BIL3MюXp-F&/jf<X3P^Jk"D'5 @Fr5!Hn! c@줖=XI^T]ySQ}ץ?wzZin Qv;dqWN#F^9jjd|~=a X)@eg)ZH+ F .9Fy>MU}}^5()_t>J tCLs_;.׵@NMioSPua-,5IMioJS:;)MjJҤ4)MjJxS򠴥)85)MW$)SlJS/.0|c[g`ŐEfW_=6WXm2L+VG}?? 
)1\A4b),t$3VJv|~rד>Ӹ8Q}= y`dKa0)  QN~Hc !j$VZ/ͤ’50PAB1-=8N3VBR3NZ5|dXhi%C&>KnC),ZQYpL$ (yHS q% .$M-1'F4z}y5J\@ vk rf4ǚ3Q,7 Y2kT87:gjv0X?-ͶAO1{^ 9WDP\fڟ5s91ark2kUJ s#N-*ӕ\<_Vu; }LU[遾}k^^u)}~._>rY Ӿ#G#xs1D@A`CkTe2l8-vgNj/!dl>|8P~5i3 Y-7ө}ض9$K|c$u3r@^֗&Hτ(fϡlKTؓAؘ2nєrEX e夬rRVNIY9)+'e夬 RJRVNl'D1}W}N.1OKVM:c>T ]c@m}/)v2h[BP0.J!BNS*R*R*A[BP>9)y*ȫq6P*>Az(CAn]K`{)u{)mykmR¾m$1Q%1q2̄1L#ʭ)|{+#f܃>St01%Q~G^$?5颤[s:ī[{ج+`!%SUr{ͯ^On_OA_>[Յ5ىaPY6(i3W~ްn-V>!ޗDY8֭ y*Su >I}8mq滝#[l >o-^CtʳjcLK9uz11i`6P Q 5F-?\ K:;νBgw'ٲ-{(eOO-ss=g]O-s`l`i3M UeMLvYwք ެ;-Y&ٟuG'vĎ]O֝8;3ifsb"<+H&@~ء(wYw{b\0a*fj&v ~1))뮏Qj:+m.}v|YF_dgül (HDώD.D"J$D"J$D"J$D"J$D"J$(Ы&Q"m `H%DfO/qUU'HIʬY 8Y9P7+ʼYMDXfHj g5abeIOXtٗu'=kʺP2l2 Y9xVXG'IF'vU$hKiUJ 2Ϻ ÎDϺKeZk]v4piyUP7Q`{ (v BX=,P8%(/e9(LH G+Cʃ@FO#(">)(gE.DHRJ4ЌvPaP! )e3a , U6Z(.V>jG)ԟ-Wi㔥ROζa-P`v`j^ u95jɊ?|6覢/F4"0-<ßgpٞU8ާ͚p(1\A4b))HPgBcp̧unY %e(YȮ߽GX\lMa $IzJnuM( ?:fR&2-SN?넂FɧSglir5ubv;|5p#6q(H57Y?^697o*{ύk.ól x?ςDm1McTYQ7̨:ELD"0ދhz/^~80I ưlaۯ_峏Zl]Ϧ._e܍orUܠVa;~.?M5e>֤A|#(%"H05jr4|Dr}[{{a,lpp<62ڭPFȃ#Y`Gu`O98iێ[+<FaacdI+UyCjZ{.Y,\<Okx 6 6iC:\ .q 2='ǀ;51}їZjQ3JrQaeP`aMʀ H*\"VRڍ~l)&;*`R@+_ aK;sB3 qC5-rʼnP r[qUu+\`/+~ƿ@.L@VGҶ*W_TcF2x%||xbž#WFKw?bY?OZ,^3%v:ϻn1zǃ_B#1 M+wĸ9YGƀck]J Dco )=nkho2ԚU!ؗ `ԉ)Hύ *#?+Z3prsGorط]I$dIܒ, $ʚ{ڴv %g6zCyuhOgm4tNm)܆ٛbOuSJ :M\1L5x M9/y[b\$yfc)|y= > ={=CmP3 =Cm=Iڞ#顶ڞJEP=?bmO!:s[ !=G{#'CoJ qǠ`B0,)Gb(B:INA2"c90#W*H[.Z%3 z S V ~_eHBK@fQJ!hbHJ>B%reiD`Ψ-U=WjWY?cW &ӵt mycP<>FƔ DqGĽTq,Dlyr %;0ql#흂&P ˌHNmjQ!LA;9W<݉ta FUܠwւ)R-9}^,,ʲyiOVn]ϧ/O."Hr oG7QFIWwfwo~2nh\HpTaŝha/GOT1\xXWTWt5OeU(0S]Qwu o][%VsߦݛF@!{H5õLw81@( )>;5[߼lQOਙ7PM.>"pB#Y$xY؟FfvpɪK 9'%2Gw鹱dNAw=wQ':SwզhaNܘzb&'#KU[ŧ;XPGlg]K'1Bty*ܿqĠm-gY"Yb"N@>x/w(ipki)blş)h90Z3rHO㨻27 h|r|f Rh! 
P`L8dH͚u澡0iBio 6n>(kZ5[PeISdAƂI!HWce\g)3l_.zK#Ք}@?.(vOF i|gRqU~Je =߸a؎a #۩hUQQReE=N;gd_r@p%X59v{_Bl\-πǠ[jXAv {ʰ\ 8oiJ.ݠ>D)36@I=}hLT$Ϻ,Tsle__aoy2%U8sQQ};%9V#e)/1%,P(..Auw zOppw{?6TxZIPRRd}PO Q%jtN*"dԂ)2ݘ/Cm(uiA@  ҔK0#JQ`>~;f-[ݚ;,C*w#W.)ؚӨi)7g+m,-E-NF$T(%aU0^UNpFP{<ۆ'@D׍b=[Ml‘J) a beirMC80-kC'RwxM-+@ϣgg)pe⤍"VMZU=Wn(1nQzu󓤧F)@d~[Y01y{/iUvHo9aĘQIX79=mUFw3v6g}T$0X`<.]ӟTQ"(F# f3-{cO(׫7)ol}LVȞ 9O08€⥻f|c sA4zA_3jBH-ARX Gmrq:4Tw$ZZn|ͬFbLDk7_^*Ay#,@Z2D~uu\m;l[l奡qqSf<ʇ"Ljsh89ot z"ZOv_+n|t6]w ᮹lv<;5ѵ{Q7~N(QI?/N]}k$u #QxNoݞs`ޭ=1rwB^>ݦzbLz+me=ohDY뙛.N C-bTb cJ_X֖ҵ"wtĶI/3c‘쳇vKFI+3QKbM̻JEMGs(|XW9iD?5GO> Go=݉hyiR-[*B]1x%Hx,l뗰 VjQ9U.UR+5ٚ/,˛Y;L6fؘsS\ҝjp {^&bsQw nQVzL+1yǹՄLʹM =lbqDXD>A|S _44b0%[bmH(?P\nZ €F)0|_GTDRLu߸77l׬ݵOn  &o({ 3D·U7c%|Ça4!nfdq_S_PA_9:|ICSVv;bbRf8Xlq4vjjjV+OM!Ѵq4qjI轎 w՘+\Y]{dG5'PBV@WY]D;PxR&wG@5X* ,Lis䘉|ySHV"DRxcbՉpka;LXsGkn׍kU؏9bY)Q_#7ӂQSI&i\ʧJ$ QsqӁ/H ӹ_S=My+ɔʩR#G^$ RGRǘHfB@'b&XHg&A0TRh;$ռ(ްJ0&/W#KioG90;\dXI} dK5_/UKɸNGN^Ax"gsDݍAHH|5(z{PP4CkŹe`B$#Df2 #*7[8V1+M軍RU):@HA5tgjnyALo-)3 L¢x>՜ Kj [tjpࡋ{;CpLkz{~w)7 A?3 2_-QP /he<+(0zэ$}sKuU_]jPKc(oo/ΏKMI/ DK~ @ʆ3cR.--9OVYl{pt&F_o iW~X4 N%W ~Z UQh- (|ى|M};Jn\GÏ0fu/zᓡa.' It{uQ}txXz;+hG`Ǥ/0Ž,%٠wׇvF^:` hb+~>?M}Lj]ͧKkXT*7^ck=زr3Ӗ~0@=yXlRd*fhz cTmg Ko.+ L2*b}*b{YwJ.WeJke=xشkb*)>=PvD03I<pYlȡpئz)f$^+ ڱ]:pQ@S'wiЈtiJ*Ǒ$gQc;&qֲ1S=&Ζ E9ڞ,Ef1܋z8 j?2a֋h:ذp2°bjX>-UYܲQ/%ʄ=GGee q&bM5afxބ jy-Wv(br|u1H2({9I}+!īU.NE_LLypǃF=8 GH$q̉Fa84d!b*CR0{q"lԙf#2?}%=ITJ-0B:#  ""ƞKllM*BP@73 Ϭ@ *'M:ihC~4!4T_rLc-$A>DLc&CigG/Q:Q6\]_?s@0ƍrj*0f?h0~Ď#,Z2ʤ4+d>NB%Wvcj}T ,NmӎY\X5.-3.[ryp,H`ݻnлmߏ)fԋy{ed-/l)eꍭٻU~qyyrvC 8% }e? 
Feb 21 00:07:43 crc systemd[1]: Starting Kubernetes Kubelet...
Feb 21 00:07:43 crc restorecon[4697]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 21 00:07:43 crc restorecon[4697]:
/var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c22 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 21 00:07:43 crc restorecon[4697]: 
/var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 00:07:43 crc restorecon[4697]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 21 00:07:43 crc restorecon[4697]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 21 00:07:43 crc restorecon[4697]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 21 00:07:43 crc restorecon[4697]: 
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c968,c969 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 
21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 21 00:07:43 crc restorecon[4697]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 21 00:07:43 crc 
restorecon[4697]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 21 00:07:43 crc restorecon[4697]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 21 00:07:43 crc restorecon[4697]: 
/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 21 00:07:43 crc restorecon[4697]: 
/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 21 00:07:43 crc restorecon[4697]: 
/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 21 00:07:43 
crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 21 00:07:43 crc 
restorecon[4697]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 21 00:07:43 crc 
restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 21 00:07:43 crc restorecon[4697]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:43 crc restorecon[4697]:
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:43 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 21 00:07:44 crc 
restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 21 00:07:44 crc 
restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Feb 21 00:07:44 crc 
restorecon[4697]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 21 00:07:44 crc 
restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 21 
00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c6 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc 
restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 
21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 
crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc 
restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to
system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc 
restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc 
restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 21 00:07:44 crc 
restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 21 00:07:44 crc restorecon[4697]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 21 00:07:44 crc restorecon[4697]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 21 00:07:44 crc restorecon[4697]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Feb 21 00:07:45 crc kubenswrapper[4906]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 21 00:07:45 crc kubenswrapper[4906]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Feb 21 00:07:45 crc kubenswrapper[4906]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 21 00:07:45 crc kubenswrapper[4906]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Feb 21 00:07:45 crc kubenswrapper[4906]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Feb 21 00:07:45 crc kubenswrapper[4906]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.273556 4906 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279283 4906 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279315 4906 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279326 4906 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279336 4906 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279345 4906 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279353 4906 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279362 4906 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279370 4906 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279383 4906 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. 
It will be removed in a future release. Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279393 4906 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279402 4906 feature_gate.go:330] unrecognized feature gate: Example Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279412 4906 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279422 4906 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279432 4906 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279441 4906 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279449 4906 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279457 4906 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279465 4906 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279476 4906 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279486 4906 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279495 4906 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279504 4906 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279512 4906 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279521 4906 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279530 4906 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279539 4906 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279547 4906 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279555 4906 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279564 4906 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279573 4906 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279581 4906 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279589 4906 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279597 4906 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279604 4906 feature_gate.go:330] 
unrecognized feature gate: NewOLM Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279612 4906 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279620 4906 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279627 4906 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279635 4906 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279643 4906 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279651 4906 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279659 4906 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279667 4906 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279674 4906 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279682 4906 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279714 4906 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279722 4906 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279730 4906 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279737 4906 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 
00:07:45.279745 4906 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279753 4906 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279762 4906 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279770 4906 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279777 4906 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279785 4906 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279792 4906 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279800 4906 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279811 4906 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279819 4906 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279827 4906 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279834 4906 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279842 4906 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279853 4906 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279860 4906 
feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279868 4906 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279875 4906 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279883 4906 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279891 4906 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279899 4906 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279907 4906 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279915 4906 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.279922 4906 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280076 4906 flags.go:64] FLAG: --address="0.0.0.0" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280092 4906 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280106 4906 flags.go:64] FLAG: --anonymous-auth="true" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280118 4906 flags.go:64] FLAG: --application-metrics-count-limit="100" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280129 4906 flags.go:64] FLAG: --authentication-token-webhook="false" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280138 4906 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280149 4906 flags.go:64] FLAG: 
--authorization-mode="AlwaysAllow" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280161 4906 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280170 4906 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280179 4906 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280189 4906 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280198 4906 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280207 4906 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280216 4906 flags.go:64] FLAG: --cgroup-root="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280225 4906 flags.go:64] FLAG: --cgroups-per-qos="true" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280234 4906 flags.go:64] FLAG: --client-ca-file="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280244 4906 flags.go:64] FLAG: --cloud-config="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280253 4906 flags.go:64] FLAG: --cloud-provider="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280262 4906 flags.go:64] FLAG: --cluster-dns="[]" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280275 4906 flags.go:64] FLAG: --cluster-domain="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280284 4906 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280294 4906 flags.go:64] FLAG: --config-dir="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280302 4906 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 
00:07:45.280312 4906 flags.go:64] FLAG: --container-log-max-files="5" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280324 4906 flags.go:64] FLAG: --container-log-max-size="10Mi" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280332 4906 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280342 4906 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280351 4906 flags.go:64] FLAG: --containerd-namespace="k8s.io" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280360 4906 flags.go:64] FLAG: --contention-profiling="false" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280369 4906 flags.go:64] FLAG: --cpu-cfs-quota="true" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280378 4906 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280387 4906 flags.go:64] FLAG: --cpu-manager-policy="none" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280397 4906 flags.go:64] FLAG: --cpu-manager-policy-options="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280413 4906 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280421 4906 flags.go:64] FLAG: --enable-controller-attach-detach="true" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280430 4906 flags.go:64] FLAG: --enable-debugging-handlers="true" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280439 4906 flags.go:64] FLAG: --enable-load-reader="false" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280448 4906 flags.go:64] FLAG: --enable-server="true" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280457 4906 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280468 4906 flags.go:64] FLAG: --event-burst="100" Feb 
21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280477 4906 flags.go:64] FLAG: --event-qps="50" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280486 4906 flags.go:64] FLAG: --event-storage-age-limit="default=0" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280495 4906 flags.go:64] FLAG: --event-storage-event-limit="default=0" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280505 4906 flags.go:64] FLAG: --eviction-hard="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280515 4906 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280523 4906 flags.go:64] FLAG: --eviction-minimum-reclaim="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280532 4906 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280542 4906 flags.go:64] FLAG: --eviction-soft="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280551 4906 flags.go:64] FLAG: --eviction-soft-grace-period="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280560 4906 flags.go:64] FLAG: --exit-on-lock-contention="false" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280569 4906 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280578 4906 flags.go:64] FLAG: --experimental-mounter-path="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280587 4906 flags.go:64] FLAG: --fail-cgroupv1="false" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280596 4906 flags.go:64] FLAG: --fail-swap-on="true" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280605 4906 flags.go:64] FLAG: --feature-gates="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280615 4906 flags.go:64] FLAG: --file-check-frequency="20s" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280624 4906 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" 
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280633 4906 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280642 4906 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280651 4906 flags.go:64] FLAG: --healthz-port="10248" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280661 4906 flags.go:64] FLAG: --help="false" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280670 4906 flags.go:64] FLAG: --hostname-override="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280679 4906 flags.go:64] FLAG: --housekeeping-interval="10s" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280715 4906 flags.go:64] FLAG: --http-check-frequency="20s" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280724 4906 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280733 4906 flags.go:64] FLAG: --image-credential-provider-config="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280742 4906 flags.go:64] FLAG: --image-gc-high-threshold="85" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280751 4906 flags.go:64] FLAG: --image-gc-low-threshold="80" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280762 4906 flags.go:64] FLAG: --image-service-endpoint="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280771 4906 flags.go:64] FLAG: --kernel-memcg-notification="false" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280780 4906 flags.go:64] FLAG: --kube-api-burst="100" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280789 4906 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280798 4906 flags.go:64] FLAG: --kube-api-qps="50" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280808 4906 flags.go:64] FLAG: --kube-reserved="" Feb 
21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280817 4906 flags.go:64] FLAG: --kube-reserved-cgroup="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280826 4906 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280836 4906 flags.go:64] FLAG: --kubelet-cgroups="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280845 4906 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280854 4906 flags.go:64] FLAG: --lock-file="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280863 4906 flags.go:64] FLAG: --log-cadvisor-usage="false" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280872 4906 flags.go:64] FLAG: --log-flush-frequency="5s" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280881 4906 flags.go:64] FLAG: --log-json-info-buffer-size="0" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280895 4906 flags.go:64] FLAG: --log-json-split-stream="false" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280904 4906 flags.go:64] FLAG: --log-text-info-buffer-size="0" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280913 4906 flags.go:64] FLAG: --log-text-split-stream="false" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280922 4906 flags.go:64] FLAG: --logging-format="text" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280930 4906 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280940 4906 flags.go:64] FLAG: --make-iptables-util-chains="true" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280949 4906 flags.go:64] FLAG: --manifest-url="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280958 4906 flags.go:64] FLAG: --manifest-url-header="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280969 4906 flags.go:64] FLAG: 
--max-housekeeping-interval="15s"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280978 4906 flags.go:64] FLAG: --max-open-files="1000000"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280989 4906 flags.go:64] FLAG: --max-pods="110"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.280999 4906 flags.go:64] FLAG: --maximum-dead-containers="-1"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281008 4906 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281017 4906 flags.go:64] FLAG: --memory-manager-policy="None"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281025 4906 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281034 4906 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281043 4906 flags.go:64] FLAG: --node-ip="192.168.126.11"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281052 4906 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281070 4906 flags.go:64] FLAG: --node-status-max-images="50"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281079 4906 flags.go:64] FLAG: --node-status-update-frequency="10s"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281088 4906 flags.go:64] FLAG: --oom-score-adj="-999"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281098 4906 flags.go:64] FLAG: --pod-cidr=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281107 4906 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281121 4906 flags.go:64] FLAG: --pod-manifest-path=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281130 4906 flags.go:64] FLAG: --pod-max-pids="-1"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281139 4906 flags.go:64] FLAG: --pods-per-core="0"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281148 4906 flags.go:64] FLAG: --port="10250"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281158 4906 flags.go:64] FLAG: --protect-kernel-defaults="false"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281166 4906 flags.go:64] FLAG: --provider-id=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281175 4906 flags.go:64] FLAG: --qos-reserved=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281184 4906 flags.go:64] FLAG: --read-only-port="10255"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281193 4906 flags.go:64] FLAG: --register-node="true"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281203 4906 flags.go:64] FLAG: --register-schedulable="true"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281212 4906 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281226 4906 flags.go:64] FLAG: --registry-burst="10"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281235 4906 flags.go:64] FLAG: --registry-qps="5"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281244 4906 flags.go:64] FLAG: --reserved-cpus=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281252 4906 flags.go:64] FLAG: --reserved-memory=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281263 4906 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281273 4906 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281282 4906 flags.go:64] FLAG: --rotate-certificates="false"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281291 4906 flags.go:64] FLAG: --rotate-server-certificates="false"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281299 4906 flags.go:64] FLAG: --runonce="false"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281309 4906 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281318 4906 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281327 4906 flags.go:64] FLAG: --seccomp-default="false"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281336 4906 flags.go:64] FLAG: --serialize-image-pulls="true"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281345 4906 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281354 4906 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281363 4906 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281372 4906 flags.go:64] FLAG: --storage-driver-password="root"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281381 4906 flags.go:64] FLAG: --storage-driver-secure="false"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281390 4906 flags.go:64] FLAG: --storage-driver-table="stats"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281399 4906 flags.go:64] FLAG: --storage-driver-user="root"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281407 4906 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281416 4906 flags.go:64] FLAG: --sync-frequency="1m0s"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281426 4906 flags.go:64] FLAG: --system-cgroups=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281439 4906 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281454 4906 flags.go:64] FLAG: --system-reserved-cgroup=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281463 4906 flags.go:64] FLAG: --tls-cert-file=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281472 4906 flags.go:64] FLAG: --tls-cipher-suites="[]"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281483 4906 flags.go:64] FLAG: --tls-min-version=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281492 4906 flags.go:64] FLAG: --tls-private-key-file=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281503 4906 flags.go:64] FLAG: --topology-manager-policy="none"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281512 4906 flags.go:64] FLAG: --topology-manager-policy-options=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281521 4906 flags.go:64] FLAG: --topology-manager-scope="container"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281531 4906 flags.go:64] FLAG: --v="2"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281543 4906 flags.go:64] FLAG: --version="false"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281554 4906 flags.go:64] FLAG: --vmodule=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281565 4906 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.281575 4906 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.281792 4906 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.281803 4906 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.281814 4906 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.281823 4906 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.281831 4906 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.281839 4906 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.281846 4906 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.281854 4906 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.281862 4906 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.281872 4906 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.281881 4906 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.281890 4906 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.281898 4906 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.281907 4906 feature_gate.go:330] unrecognized feature gate: Example
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.281915 4906 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.281924 4906 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.281933 4906 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.281940 4906 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.281952 4906 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.281960 4906 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.281968 4906 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.281975 4906 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.281983 4906 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.281991 4906 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.282000 4906 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.282008 4906 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.282016 4906 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.282024 4906 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.282032 4906 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.282042 4906 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.282051 4906 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.282062 4906 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.282072 4906 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.282081 4906 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.282089 4906 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.282098 4906 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.282106 4906 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.282114 4906 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.282121 4906 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.282129 4906 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.282137 4906 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.282145 4906 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.282153 4906 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.282160 4906 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.282169 4906 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.282177 4906 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.282184 4906 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.282192 4906 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.282199 4906 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.282207 4906 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.282220 4906 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.282230 4906 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.282239 4906 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.282247 4906 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.282255 4906 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.282262 4906 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.282271 4906 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.282279 4906 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.282286 4906 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.282296 4906 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.282303 4906 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.282311 4906 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.282319 4906 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.282327 4906 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.282334 4906 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.282342 4906 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.282350 4906 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.282357 4906 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.282365 4906 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.282372 4906 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.282380 4906 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.282407 4906 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.298955 4906 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.298995 4906 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299135 4906 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299148 4906 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299157 4906 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299165 4906 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299174 4906 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299182 4906 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299190 4906 feature_gate.go:330] unrecognized feature gate: Example
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299198 4906 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299206 4906 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299214 4906 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299223 4906 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299231 4906 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299238 4906 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299246 4906 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299253 4906 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299261 4906 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299269 4906 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299280 4906 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299288 4906 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299296 4906 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299304 4906 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299313 4906 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299320 4906 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299328 4906 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299335 4906 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299343 4906 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299354 4906 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299366 4906 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299375 4906 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299383 4906 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299391 4906 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299399 4906 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299407 4906 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299414 4906 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299423 4906 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299431 4906 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299438 4906 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299446 4906 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299454 4906 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299462 4906 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299469 4906 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299477 4906 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299486 4906 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299495 4906 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299503 4906 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299510 4906 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299518 4906 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299526 4906 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299533 4906 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299541 4906 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299548 4906 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299556 4906 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299564 4906 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299571 4906 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299579 4906 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299586 4906 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299594 4906 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299601 4906 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299612 4906 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299622 4906 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299630 4906 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299639 4906 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299647 4906 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299657 4906 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299667 4906 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299676 4906 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299712 4906 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299721 4906 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299730 4906 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299738 4906 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299747 4906 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.299759 4906 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299972 4906 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299986 4906 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.299996 4906 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300007 4906 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300017 4906 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300027 4906 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300037 4906 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300046 4906 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300054 4906 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300064 4906 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300072 4906 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300080 4906 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300088 4906 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300096 4906 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300104 4906 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300111 4906 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300120 4906 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300128 4906 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300136 4906 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300143 4906 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300151 4906 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300158 4906 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300166 4906 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300174 4906 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300181 4906 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300189 4906 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300196 4906 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300204 4906 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300211 4906 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300219 4906 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300227 4906 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300234 4906 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300242 4906 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300250 4906 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300258 4906 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300268 4906 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300278 4906 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300286 4906 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300294 4906 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300304 4906 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300313 4906 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300321 4906 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300330 4906 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300338 4906 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300345 4906 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300353 4906 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300360 4906 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300368 4906 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300376 4906 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300383 4906 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300391 4906 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300398 4906 feature_gate.go:330] unrecognized feature gate: Example
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300406 4906 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300414 4906 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300422 4906 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300429 4906 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300437 4906 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300444 4906 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300453 4906 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300460 4906 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300468 4906 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300475 4906 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300483 4906 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300491 4906 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300498 4906 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300506 4906 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300513 4906 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300521 4906 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300528 4906 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300537 4906 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.300545 4906 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.300556 4906 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.301814 4906 server.go:940] "Client rotation is on, will bootstrap in background"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.307125 4906 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.307259 4906 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.309064 4906 server.go:997] "Starting client certificate rotation"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.309113 4906 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.309360 4906 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-29 11:21:48.75662541 +0000 UTC
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.309500 4906 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.341548 4906 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 21 00:07:45 crc kubenswrapper[4906]: E0221 00:07:45.346305 4906 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.136:6443: connect: connection refused" logger="UnhandledError"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.349419 4906 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.361596 4906 log.go:25] "Validated CRI v1 runtime API"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.394094 4906 log.go:25] "Validated CRI v1 image API"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.396242 4906 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.404136 4906 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-02-21-00-03-02-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.404246 4906 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.418274 4906 manager.go:217] Machine: {Timestamp:2026-02-21 00:07:45.416750431 +0000 UTC m=+0.668337947 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:94310220-1d46-4502-bb0a-b3628ff11479 BootID:4d8b0bdc-2182-48d0-bb15-cc57765305f9 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:42:b9:10 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:42:b9:10 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:e8:ba:4f Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:49:26:9a Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:d6:56:03 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:7c:84:32 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:5a:06:39:02:35:96 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:d2:e5:c8:8e:ce:69 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.418641 4906 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.418882 4906 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.419680 4906 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.420035 4906 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.420090 4906 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.420429 4906 topology_manager.go:138] "Creating topology manager with none policy"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.420451 4906 container_manager_linux.go:303] "Creating device plugin manager"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.420945 4906 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.420993 4906 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.421874 4906 state_mem.go:36] "Initialized new in-memory state store"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.422040 4906 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.427800 4906 kubelet.go:418] "Attempting to sync node with API server"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.427854 4906 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.427882 4906 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.427905 4906 kubelet.go:324] "Adding apiserver pod source"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.427923 4906 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.435540 4906 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused
Feb 21 00:07:45 crc kubenswrapper[4906]: E0221 00:07:45.435708 4906 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.136:6443: connect: connection refused" logger="UnhandledError"
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.435663 4906 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused
Feb 21 00:07:45 crc kubenswrapper[4906]: E0221 00:07:45.435940 4906 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.136:6443: connect: connection refused" logger="UnhandledError"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.437667 4906 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.439776 4906 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.441458 4906 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.443299 4906 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.443359 4906 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.443384 4906 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.443403 4906 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.443434 4906 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.443467 4906 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.443484 4906 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.443509 4906 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.443528 4906 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.443545 4906 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.443566 4906 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.443582 4906 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.443631 4906 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.444487 4906 server.go:1280] "Started kubelet"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.444804 4906 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.444932 4906 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.445596 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.445760 4906 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Feb 21 00:07:45 crc systemd[1]: Started Kubernetes Kubelet.
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.448824 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.448878 4906 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.449132 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 16:10:19.556935245 +0000 UTC
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.449634 4906 volume_manager.go:287] "The desired_state_of_world populator starts"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.449669 4906 volume_manager.go:289] "Starting Kubelet Volume Manager"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.449756 4906 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Feb 21 00:07:45 crc kubenswrapper[4906]: E0221 00:07:45.450183 4906 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.450297 4906 server.go:460] "Adding debug handlers to kubelet server"
Feb 21 00:07:45 crc kubenswrapper[4906]: E0221 00:07:45.450657 4906 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" interval="200ms"
Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.450669 4906 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused
Feb 21 00:07:45 crc kubenswrapper[4906]: E0221 00:07:45.450911 4906 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.136:6443: connect: connection refused" logger="UnhandledError"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.454408 4906 factory.go:55] Registering systemd factory
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.454514 4906 factory.go:221] Registration of the systemd container factory successfully
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.455068 4906 factory.go:153] Registering CRI-O factory
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.455144 4906 factory.go:221] Registration of the crio container factory successfully
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.455258 4906 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.455348 4906 factory.go:103] Registering Raw factory
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.455419 4906 manager.go:1196] Started watching for new ooms in manager
Feb 21 00:07:45 crc kubenswrapper[4906]: E0221 00:07:45.454256 4906 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.136:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.18961a573f5d7ed5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-21 00:07:45.444437717 +0000 UTC m=+0.696025263,LastTimestamp:2026-02-21 00:07:45.444437717 +0000 UTC m=+0.696025263,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.456232 4906 manager.go:319] Starting recovery of all containers
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.474763 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.474864 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.474886 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.474899 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.474917 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.474940 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.474956 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.474972 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.474991 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.475043 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.475146 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.475162 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.475178 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.475201 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.475220 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.475236 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.475254 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.475298 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.475337 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.475353 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.475368 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.475388 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.475404 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.475419 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.475436 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.475479 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.475499 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.475520 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.475534 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.475547 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.475562 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.475577 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.475589 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.475600 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.475612 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.475625 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.475638 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.475652 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.475665 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512"
volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.475677 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.475765 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.475778 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.475794 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.475810 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.475825 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.476315 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.476336 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.476357 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.476375 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.476393 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.476442 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" 
seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.476460 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.476487 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.476513 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.476532 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.476553 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.476597 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.476615 
4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.476636 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.476656 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.476674 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.476709 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.476728 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.476745 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.476764 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.476785 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.476804 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.476821 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.476838 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.476858 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.476877 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.476895 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.476913 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.476930 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.476948 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.476967 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.476984 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.477002 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.477023 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.477039 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.477057 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.477075 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.477092 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.477111 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.477133 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.477157 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.477177 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.477196 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" 
volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.477218 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.477236 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.477255 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.477272 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.477292 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.477312 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" 
seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.477331 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.477349 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.477367 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.477388 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.477405 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.477421 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.477442 4906 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.477461 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.477479 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.477497 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.477522 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.477541 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.477558 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.477576 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.479853 4906 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.479899 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.479920 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.479939 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.479957 4906 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.479975 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.479995 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480013 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480031 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480054 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480071 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480087 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480105 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480120 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480138 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480155 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480170 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480189 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480207 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480224 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480241 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480258 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480281 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480302 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480319 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480337 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480355 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480371 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480387 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480404 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480422 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480438 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480453 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480470 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480489 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480508 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480526 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480544 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480560 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480576 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480602 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480620 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480637 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480653 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480671 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480704 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480724 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480741 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480758 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480775 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480792 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480809 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480825 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" 
seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480843 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480858 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480876 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480891 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480911 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480927 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 
00:07:45.480956 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480973 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.480988 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.481004 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.481021 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.481036 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.481052 4906 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.481067 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.481085 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.481102 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.481119 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.481134 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.481150 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.481167 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.481186 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.481204 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.481221 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.481236 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.481253 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" 
volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.481269 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.481287 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.481304 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.481318 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.481332 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.481348 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.481365 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.481380 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.481396 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.481413 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.481431 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.481447 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" 
volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.481463 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.481478 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.481492 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.481511 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.481526 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.481541 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.481559 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.481578 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.481597 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.481614 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.481626 4906 reconstruct.go:97] "Volume reconstruction finished" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.481636 4906 reconciler.go:26] "Reconciler: start to sync state" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.491759 4906 manager.go:324] Recovery completed Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.506042 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.509883 4906 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.509924 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.509936 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.512384 4906 cpu_manager.go:225] "Starting CPU manager" policy="none" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.512418 4906 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.512450 4906 state_mem.go:36] "Initialized new in-memory state store" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.513491 4906 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.515596 4906 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.515630 4906 status_manager.go:217] "Starting to sync pod status with apiserver" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.515833 4906 kubelet.go:2335] "Starting kubelet main sync loop" Feb 21 00:07:45 crc kubenswrapper[4906]: E0221 00:07:45.515882 4906 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Feb 21 00:07:45 crc kubenswrapper[4906]: W0221 00:07:45.516448 4906 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused Feb 21 00:07:45 crc kubenswrapper[4906]: E0221 00:07:45.516544 4906 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.136:6443: connect: connection refused" logger="UnhandledError" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.527318 4906 policy_none.go:49] "None policy: Start" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.528190 4906 memory_manager.go:170] "Starting memorymanager" policy="None" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.528223 4906 state_mem.go:35] "Initializing new in-memory state store" Feb 21 00:07:45 crc kubenswrapper[4906]: E0221 00:07:45.551180 4906 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.591069 4906 manager.go:334] "Starting Device Plugin manager" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.591145 4906 manager.go:513] "Failed to read data from 
checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.591162 4906 server.go:79] "Starting device plugin registration server" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.591719 4906 eviction_manager.go:189] "Eviction manager: starting control loop" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.591737 4906 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.591939 4906 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.592024 4906 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.592032 4906 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 21 00:07:45 crc kubenswrapper[4906]: E0221 00:07:45.601576 4906 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.616379 4906 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.616526 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.617813 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.617861 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.617873 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.618031 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.618337 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.618426 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.618814 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.618848 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.618865 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.619019 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.619193 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.619225 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.619843 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.619854 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.619867 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.619875 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.619890 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.619879 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.620604 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.620659 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.620673 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.621217 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 21 00:07:45 crc 
kubenswrapper[4906]: I0221 00:07:45.621301 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.621342 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.623141 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.623185 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.623196 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.623142 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.623328 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.623382 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.624471 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.624498 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.624763 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.625344 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.625390 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.625407 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.625570 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.625634 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.625646 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.625721 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.625752 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.627154 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.627185 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.627201 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:07:45 crc kubenswrapper[4906]: E0221 00:07:45.651974 4906 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" interval="400ms" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.684182 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.684212 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.684235 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.684251 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.684308 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.684360 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.684392 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.684424 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.684458 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.684497 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.684535 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.684572 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.684612 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.684641 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.684671 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.692166 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.692975 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.693006 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.693015 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:07:45 crc 
kubenswrapper[4906]: I0221 00:07:45.693056 4906 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 21 00:07:45 crc kubenswrapper[4906]: E0221 00:07:45.693406 4906 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.136:6443: connect: connection refused" node="crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.786216 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.786280 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.786318 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.786344 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.786364 4906 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.786386 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.786404 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.786423 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.786419 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.786452 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.786535 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.786561 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.786571 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.786587 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.786627 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.786561 4906 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.786530 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.786601 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.786581 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.786783 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.786814 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod 
\"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.786906 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.786639 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.787016 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.786670 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.786908 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.786855 4906 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.787174 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.786639 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.787347 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.894389 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.896023 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.896069 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.896082 4906 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.896114 4906 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 21 00:07:45 crc kubenswrapper[4906]: E0221 00:07:45.896652 4906 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.136:6443: connect: connection refused" node="crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.973005 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 21 00:07:45 crc kubenswrapper[4906]: I0221 00:07:45.982336 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 21 00:07:46 crc kubenswrapper[4906]: I0221 00:07:46.007130 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 21 00:07:46 crc kubenswrapper[4906]: I0221 00:07:46.013627 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 21 00:07:46 crc kubenswrapper[4906]: W0221 00:07:46.028084 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-259ebe85f5cf6104f0a4e26e3750707fff650d225d52e87343979eed0dd6e89f WatchSource:0}: Error finding container 259ebe85f5cf6104f0a4e26e3750707fff650d225d52e87343979eed0dd6e89f: Status 404 returned error can't find the container with id 259ebe85f5cf6104f0a4e26e3750707fff650d225d52e87343979eed0dd6e89f Feb 21 00:07:46 crc kubenswrapper[4906]: W0221 00:07:46.030622 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-fb7f4bcb9e3c2b055737551ef12ca724ff7987a0f4b2e945a2c72ccd97388f64 WatchSource:0}: Error finding container fb7f4bcb9e3c2b055737551ef12ca724ff7987a0f4b2e945a2c72ccd97388f64: Status 404 returned error can't find the container with id fb7f4bcb9e3c2b055737551ef12ca724ff7987a0f4b2e945a2c72ccd97388f64 Feb 21 00:07:46 crc kubenswrapper[4906]: I0221 00:07:46.039287 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 21 00:07:46 crc kubenswrapper[4906]: W0221 00:07:46.043461 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-99b2bd0ceef7d058987edb30ce89e7e1c510e368f9c57c47997e18b1803334fa WatchSource:0}: Error finding container 99b2bd0ceef7d058987edb30ce89e7e1c510e368f9c57c47997e18b1803334fa: Status 404 returned error can't find the container with id 99b2bd0ceef7d058987edb30ce89e7e1c510e368f9c57c47997e18b1803334fa Feb 21 00:07:46 crc kubenswrapper[4906]: W0221 00:07:46.044945 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-79e484105e6795d5e926ba1314f28430edc196d303cfd0cbe194b8ec369cadf8 WatchSource:0}: Error finding container 79e484105e6795d5e926ba1314f28430edc196d303cfd0cbe194b8ec369cadf8: Status 404 returned error can't find the container with id 79e484105e6795d5e926ba1314f28430edc196d303cfd0cbe194b8ec369cadf8 Feb 21 00:07:46 crc kubenswrapper[4906]: E0221 00:07:46.053281 4906 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" interval="800ms" Feb 21 00:07:46 crc kubenswrapper[4906]: W0221 00:07:46.065129 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-b289f36484c93ba3d7683e761c8a6982670d17493de39005d154eadb367286d3 WatchSource:0}: Error finding container b289f36484c93ba3d7683e761c8a6982670d17493de39005d154eadb367286d3: Status 404 returned error can't find the container with id 
b289f36484c93ba3d7683e761c8a6982670d17493de39005d154eadb367286d3 Feb 21 00:07:46 crc kubenswrapper[4906]: I0221 00:07:46.297631 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 21 00:07:46 crc kubenswrapper[4906]: I0221 00:07:46.301350 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:07:46 crc kubenswrapper[4906]: I0221 00:07:46.301396 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:07:46 crc kubenswrapper[4906]: I0221 00:07:46.301414 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:07:46 crc kubenswrapper[4906]: I0221 00:07:46.301447 4906 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 21 00:07:46 crc kubenswrapper[4906]: E0221 00:07:46.302054 4906 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.136:6443: connect: connection refused" node="crc" Feb 21 00:07:46 crc kubenswrapper[4906]: I0221 00:07:46.447237 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused Feb 21 00:07:46 crc kubenswrapper[4906]: I0221 00:07:46.450188 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 22:24:55.167865531 +0000 UTC Feb 21 00:07:46 crc kubenswrapper[4906]: W0221 00:07:46.465157 4906 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.136:6443: 
connect: connection refused
Feb 21 00:07:46 crc kubenswrapper[4906]: E0221 00:07:46.465279    4906 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.136:6443: connect: connection refused" logger="UnhandledError"
Feb 21 00:07:46 crc kubenswrapper[4906]: I0221 00:07:46.522637    4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b289f36484c93ba3d7683e761c8a6982670d17493de39005d154eadb367286d3"}
Feb 21 00:07:46 crc kubenswrapper[4906]: I0221 00:07:46.528096    4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"79e484105e6795d5e926ba1314f28430edc196d303cfd0cbe194b8ec369cadf8"}
Feb 21 00:07:46 crc kubenswrapper[4906]: I0221 00:07:46.534415    4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"99b2bd0ceef7d058987edb30ce89e7e1c510e368f9c57c47997e18b1803334fa"}
Feb 21 00:07:46 crc kubenswrapper[4906]: W0221 00:07:46.534763    4906 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused
Feb 21 00:07:46 crc kubenswrapper[4906]: E0221 00:07:46.534857    4906 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.136:6443: connect: connection refused" logger="UnhandledError"
Feb 21 00:07:46 crc kubenswrapper[4906]: I0221 00:07:46.536188    4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"259ebe85f5cf6104f0a4e26e3750707fff650d225d52e87343979eed0dd6e89f"}
Feb 21 00:07:46 crc kubenswrapper[4906]: I0221 00:07:46.537870    4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"fb7f4bcb9e3c2b055737551ef12ca724ff7987a0f4b2e945a2c72ccd97388f64"}
Feb 21 00:07:46 crc kubenswrapper[4906]: W0221 00:07:46.775627    4906 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused
Feb 21 00:07:46 crc kubenswrapper[4906]: E0221 00:07:46.776299    4906 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.136:6443: connect: connection refused" logger="UnhandledError"
Feb 21 00:07:46 crc kubenswrapper[4906]: E0221 00:07:46.854417    4906 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" interval="1.6s"
Feb 21 00:07:46 crc kubenswrapper[4906]: W0221 00:07:46.982088    4906 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused
Feb 21 00:07:46 crc kubenswrapper[4906]: E0221 00:07:46.982203    4906 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.136:6443: connect: connection refused" logger="UnhandledError"
Feb 21 00:07:47 crc kubenswrapper[4906]: I0221 00:07:47.103110    4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 00:07:47 crc kubenswrapper[4906]: I0221 00:07:47.104803    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:07:47 crc kubenswrapper[4906]: I0221 00:07:47.104869    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:07:47 crc kubenswrapper[4906]: I0221 00:07:47.104886    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:07:47 crc kubenswrapper[4906]: I0221 00:07:47.104928    4906 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 21 00:07:47 crc kubenswrapper[4906]: E0221 00:07:47.105571    4906 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.136:6443: connect: connection refused" node="crc"
Feb 21 00:07:47 crc kubenswrapper[4906]: I0221 00:07:47.446960    4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused
Feb 21 00:07:47 crc kubenswrapper[4906]: I0221 00:07:47.451401    4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 07:36:15.762666132 +0000 UTC
Feb 21 00:07:47 crc kubenswrapper[4906]: I0221 00:07:47.460518    4906 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 21 00:07:47 crc kubenswrapper[4906]: E0221 00:07:47.462703    4906 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.136:6443: connect: connection refused" logger="UnhandledError"
Feb 21 00:07:47 crc kubenswrapper[4906]: I0221 00:07:47.544183    4906 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635" exitCode=0
Feb 21 00:07:47 crc kubenswrapper[4906]: I0221 00:07:47.544266    4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635"}
Feb 21 00:07:47 crc kubenswrapper[4906]: I0221 00:07:47.544331    4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 00:07:47 crc kubenswrapper[4906]: I0221 00:07:47.545758    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:07:47 crc kubenswrapper[4906]: I0221 00:07:47.545800    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:07:47 crc kubenswrapper[4906]: I0221 00:07:47.545814    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:07:47 crc kubenswrapper[4906]: I0221 00:07:47.546993    4906 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="0451d4a6a1426a64e5717a5c6ebb6357d13778bc225c6d9f1aff545928bd572f" exitCode=0
Feb 21 00:07:47 crc kubenswrapper[4906]: I0221 00:07:47.547036    4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"0451d4a6a1426a64e5717a5c6ebb6357d13778bc225c6d9f1aff545928bd572f"}
Feb 21 00:07:47 crc kubenswrapper[4906]: I0221 00:07:47.547020    4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 00:07:47 crc kubenswrapper[4906]: I0221 00:07:47.548122    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:07:47 crc kubenswrapper[4906]: I0221 00:07:47.548150    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:07:47 crc kubenswrapper[4906]: I0221 00:07:47.548161    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:07:47 crc kubenswrapper[4906]: I0221 00:07:47.551056    4906 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="00beda5ca660a5e8c6df9410952b3875c1f107b2b68d5d792286766879fcc4c3" exitCode=0
Feb 21 00:07:47 crc kubenswrapper[4906]: I0221 00:07:47.551102    4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"00beda5ca660a5e8c6df9410952b3875c1f107b2b68d5d792286766879fcc4c3"}
Feb 21 00:07:47 crc kubenswrapper[4906]: I0221 00:07:47.551170    4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 00:07:47 crc kubenswrapper[4906]: I0221 00:07:47.552304    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:07:47 crc kubenswrapper[4906]: I0221 00:07:47.552348    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:07:47 crc kubenswrapper[4906]: I0221 00:07:47.552361    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:07:47 crc kubenswrapper[4906]: I0221 00:07:47.554785    4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7928eee765ac7aaa7118868638603d627a00b59850dac177c991754fc324122c"}
Feb 21 00:07:47 crc kubenswrapper[4906]: I0221 00:07:47.554820    4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"57d4b213ed7600da4237ff1c423d004e3afdaff5f599b453398266f6cdae16ac"}
Feb 21 00:07:47 crc kubenswrapper[4906]: I0221 00:07:47.557982    4906 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9" exitCode=0
Feb 21 00:07:47 crc kubenswrapper[4906]: I0221 00:07:47.558021    4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9"}
Feb 21 00:07:47 crc kubenswrapper[4906]: I0221 00:07:47.558132    4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 00:07:47 crc kubenswrapper[4906]: I0221 00:07:47.560955    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:07:47 crc kubenswrapper[4906]: I0221 00:07:47.561618    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:07:47 crc kubenswrapper[4906]: I0221 00:07:47.561634    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:07:47 crc kubenswrapper[4906]: I0221 00:07:47.565369    4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 00:07:47 crc kubenswrapper[4906]: I0221 00:07:47.566347    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:07:47 crc kubenswrapper[4906]: I0221 00:07:47.566424    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:07:47 crc kubenswrapper[4906]: I0221 00:07:47.566443    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:07:48 crc kubenswrapper[4906]: W0221 00:07:48.437254    4906 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused
Feb 21 00:07:48 crc kubenswrapper[4906]: E0221 00:07:48.437341    4906 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.136:6443: connect: connection refused" logger="UnhandledError"
Feb 21 00:07:48 crc kubenswrapper[4906]: I0221 00:07:48.446488    4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused
Feb 21 00:07:48 crc kubenswrapper[4906]: I0221 00:07:48.451914    4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 22:34:52.921231912 +0000 UTC
Feb 21 00:07:48 crc kubenswrapper[4906]: E0221 00:07:48.456997    4906 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" interval="3.2s"
Feb 21 00:07:48 crc kubenswrapper[4906]: I0221 00:07:48.563369    4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5930ffb00a4f30eb0905205eee3f562b0b57a27d60f960d1e11a60f16da8aa21"}
Feb 21 00:07:48 crc kubenswrapper[4906]: I0221 00:07:48.563427    4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5ef80849abf92993417e85ac3dec5895f800ebf7f3d27bfc052b9bcbefde9115"}
Feb 21 00:07:48 crc kubenswrapper[4906]: I0221 00:07:48.563436    4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 00:07:48 crc kubenswrapper[4906]: I0221 00:07:48.563442    4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"3d2240fcc84fe2ddd0f38e9418c7ad17b0a19fd0b283a2833790a7ccfeda9d7f"}
Feb 21 00:07:48 crc kubenswrapper[4906]: I0221 00:07:48.564778    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:07:48 crc kubenswrapper[4906]: I0221 00:07:48.565002    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:07:48 crc kubenswrapper[4906]: I0221 00:07:48.565072    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:07:48 crc kubenswrapper[4906]: I0221 00:07:48.565229    4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"fef2c643e773be0c9669aeb65a0e55346520d2bcd0d52ea0b2d7fcb105a59276"}
Feb 21 00:07:48 crc kubenswrapper[4906]: I0221 00:07:48.565261    4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 00:07:48 crc kubenswrapper[4906]: I0221 00:07:48.566054    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:07:48 crc kubenswrapper[4906]: I0221 00:07:48.566255    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:07:48 crc kubenswrapper[4906]: I0221 00:07:48.566327    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:07:48 crc kubenswrapper[4906]: I0221 00:07:48.568941    4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"32f5568f3c4167bda4510df4f94df728f3286e7c8137bb0ac9f6af2c30c4992a"}
Feb 21 00:07:48 crc kubenswrapper[4906]: I0221 00:07:48.568975    4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"efc3218b20e5cf2be1651a9fbaef483c8cb2bd297449319f559727af1a47661c"}
Feb 21 00:07:48 crc kubenswrapper[4906]: I0221 00:07:48.569333    4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 00:07:48 crc kubenswrapper[4906]: I0221 00:07:48.572789    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:07:48 crc kubenswrapper[4906]: I0221 00:07:48.572836    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:07:48 crc kubenswrapper[4906]: I0221 00:07:48.572853    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:07:48 crc kubenswrapper[4906]: I0221 00:07:48.582004    4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58"}
Feb 21 00:07:48 crc kubenswrapper[4906]: I0221 00:07:48.582058    4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa"}
Feb 21 00:07:48 crc kubenswrapper[4906]: I0221 00:07:48.582073    4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c"}
Feb 21 00:07:48 crc kubenswrapper[4906]: I0221 00:07:48.582088    4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8"}
Feb 21 00:07:48 crc kubenswrapper[4906]: I0221 00:07:48.584027    4906 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410" exitCode=0
Feb 21 00:07:48 crc kubenswrapper[4906]: I0221 00:07:48.584067    4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410"}
Feb 21 00:07:48 crc kubenswrapper[4906]: I0221 00:07:48.584163    4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 00:07:48 crc kubenswrapper[4906]: I0221 00:07:48.585472    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:07:48 crc kubenswrapper[4906]: I0221 00:07:48.585510    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:07:48 crc kubenswrapper[4906]: I0221 00:07:48.585522    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:07:48 crc kubenswrapper[4906]: W0221 00:07:48.621389    4906 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused
Feb 21 00:07:48 crc kubenswrapper[4906]: E0221 00:07:48.621599    4906 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.136:6443: connect: connection refused" logger="UnhandledError"
Feb 21 00:07:48 crc kubenswrapper[4906]: I0221 00:07:48.706077    4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 00:07:48 crc kubenswrapper[4906]: I0221 00:07:48.707290    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:07:48 crc kubenswrapper[4906]: I0221 00:07:48.707373    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:07:48 crc kubenswrapper[4906]: I0221 00:07:48.707389    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:07:48 crc kubenswrapper[4906]: I0221 00:07:48.707449    4906 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 21 00:07:48 crc kubenswrapper[4906]: E0221 00:07:48.708397    4906 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.136:6443: connect: connection refused" node="crc"
Feb 21 00:07:49 crc kubenswrapper[4906]: I0221 00:07:49.452131    4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 08:04:19.621437692 +0000 UTC
Feb 21 00:07:49 crc kubenswrapper[4906]: I0221 00:07:49.590228    4906 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b" exitCode=0
Feb 21 00:07:49 crc kubenswrapper[4906]: I0221 00:07:49.590280    4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b"}
Feb 21 00:07:49 crc kubenswrapper[4906]: I0221 00:07:49.590428    4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 00:07:49 crc kubenswrapper[4906]: I0221 00:07:49.591916    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:07:49 crc kubenswrapper[4906]: I0221 00:07:49.591957    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:07:49 crc kubenswrapper[4906]: I0221 00:07:49.591971    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:07:49 crc kubenswrapper[4906]: I0221 00:07:49.595031    4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"104130e8d59df60560691070c803ca780c06ea5439f8d04c160d53df2042c41f"}
Feb 21 00:07:49 crc kubenswrapper[4906]: I0221 00:07:49.595219    4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 21 00:07:49 crc kubenswrapper[4906]: I0221 00:07:49.595173    4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 00:07:49 crc kubenswrapper[4906]: I0221 00:07:49.595236    4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 00:07:49 crc kubenswrapper[4906]: I0221 00:07:49.595116    4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 00:07:49 crc kubenswrapper[4906]: I0221 00:07:49.595176    4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 00:07:49 crc kubenswrapper[4906]: I0221 00:07:49.596668    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:07:49 crc kubenswrapper[4906]: I0221 00:07:49.596712    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:07:49 crc kubenswrapper[4906]: I0221 00:07:49.596725    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:07:49 crc kubenswrapper[4906]: I0221 00:07:49.596730    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:07:49 crc kubenswrapper[4906]: I0221 00:07:49.596762    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:07:49 crc kubenswrapper[4906]: I0221 00:07:49.596779    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:07:49 crc kubenswrapper[4906]: I0221 00:07:49.596850    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:07:49 crc kubenswrapper[4906]: I0221 00:07:49.596882    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:07:49 crc kubenswrapper[4906]: I0221 00:07:49.596901    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:07:49 crc kubenswrapper[4906]: I0221 00:07:49.598017    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:07:49 crc kubenswrapper[4906]: I0221 00:07:49.598040    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:07:49 crc kubenswrapper[4906]: I0221 00:07:49.598050    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:07:50 crc kubenswrapper[4906]: I0221 00:07:50.269811    4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 21 00:07:50 crc kubenswrapper[4906]: I0221 00:07:50.453088    4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 18:43:55.184824818 +0000 UTC
Feb 21 00:07:50 crc kubenswrapper[4906]: I0221 00:07:50.607628    4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"02bc1cd9f332e30d7fde3e05085076734d742d2bf375b3b5a55ba7a7e42e0a81"}
Feb 21 00:07:50 crc kubenswrapper[4906]: I0221 00:07:50.607725    4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"686312fec8d89c23b2ef872e891fd0c1ba1279b0a64c834893c2d8431fca05bf"}
Feb 21 00:07:50 crc kubenswrapper[4906]: I0221 00:07:50.607752    4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6c798be0178552e289398fd4312d627282ba16f1cd39b2eb7d6e2ba6d4277323"}
Feb 21 00:07:50 crc kubenswrapper[4906]: I0221 00:07:50.607732    4906 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 21 00:07:50 crc kubenswrapper[4906]: I0221 00:07:50.607809    4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 00:07:50 crc kubenswrapper[4906]: I0221 00:07:50.607832    4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 00:07:50 crc kubenswrapper[4906]: I0221 00:07:50.610338    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:07:50 crc kubenswrapper[4906]: I0221 00:07:50.610392    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:07:50 crc kubenswrapper[4906]: I0221 00:07:50.610433    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:07:50 crc kubenswrapper[4906]: I0221 00:07:50.610447    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:07:50 crc kubenswrapper[4906]: I0221 00:07:50.610408    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:07:50 crc kubenswrapper[4906]: I0221 00:07:50.610512    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:07:51 crc kubenswrapper[4906]: I0221 00:07:51.107246    4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 21 00:07:51 crc kubenswrapper[4906]: I0221 00:07:51.453273    4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 19:02:57.794350339 +0000 UTC
Feb 21 00:07:51 crc kubenswrapper[4906]: I0221 00:07:51.525910    4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 21 00:07:51 crc kubenswrapper[4906]: I0221 00:07:51.526096    4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 00:07:51 crc kubenswrapper[4906]: I0221 00:07:51.527775    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:07:51 crc kubenswrapper[4906]: I0221 00:07:51.527828    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:07:51 crc kubenswrapper[4906]: I0221 00:07:51.527846    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:07:51 crc kubenswrapper[4906]: I0221 00:07:51.610557    4906 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 21 00:07:51 crc kubenswrapper[4906]: I0221 00:07:51.617106    4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"255599132ce34bbd49e4298935a18d1294b2055069c606d796d58656a5552a02"}
Feb 21 00:07:51 crc kubenswrapper[4906]: I0221 00:07:51.617180    4906 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 21 00:07:51 crc kubenswrapper[4906]: I0221 00:07:51.617221    4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 00:07:51 crc kubenswrapper[4906]: I0221 00:07:51.617270    4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 00:07:51 crc kubenswrapper[4906]: I0221 00:07:51.617193    4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4fbe8b3f30d9bff0951ca61694164e4ece35ca5d3562ad3571150b1cd13236c4"}
Feb 21 00:07:51 crc kubenswrapper[4906]: I0221 00:07:51.618616    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:07:51 crc kubenswrapper[4906]: I0221 00:07:51.618733    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:07:51 crc kubenswrapper[4906]: I0221 00:07:51.618754    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:07:51 crc kubenswrapper[4906]: I0221 00:07:51.618815    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:07:51 crc kubenswrapper[4906]: I0221 00:07:51.618862    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:07:51 crc kubenswrapper[4906]: I0221 00:07:51.618885    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:07:51 crc kubenswrapper[4906]: I0221 00:07:51.909349    4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 00:07:51 crc kubenswrapper[4906]: I0221 00:07:51.911209    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:07:51 crc kubenswrapper[4906]: I0221 00:07:51.911265    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:07:51 crc kubenswrapper[4906]: I0221 00:07:51.911282    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:07:51 crc kubenswrapper[4906]: I0221 00:07:51.911316    4906 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 21 00:07:52 crc kubenswrapper[4906]: I0221 00:07:52.154651    4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 21 00:07:52 crc kubenswrapper[4906]: I0221 00:07:52.453705    4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 14:13:21.227263146 +0000 UTC
Feb 21 00:07:52 crc kubenswrapper[4906]: I0221 00:07:52.619511    4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 00:07:52 crc kubenswrapper[4906]: I0221 00:07:52.619582    4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 00:07:52 crc kubenswrapper[4906]: I0221 00:07:52.621421    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:07:52 crc kubenswrapper[4906]: I0221 00:07:52.621476    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:07:52 crc kubenswrapper[4906]: I0221 00:07:52.621495    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:07:52 crc kubenswrapper[4906]: I0221 00:07:52.621555    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:07:52 crc kubenswrapper[4906]: I0221 00:07:52.621587    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:07:52 crc kubenswrapper[4906]: I0221 00:07:52.621601    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:07:53 crc kubenswrapper[4906]: I0221 00:07:53.334798    4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Feb 21 00:07:53 crc kubenswrapper[4906]: I0221 00:07:53.454511    4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 07:45:47.162827384 +0000 UTC
Feb 21 00:07:53 crc kubenswrapper[4906]: I0221 00:07:53.622558    4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 00:07:53 crc kubenswrapper[4906]: I0221 00:07:53.622587    4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 00:07:53 crc kubenswrapper[4906]: I0221 00:07:53.624625    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:07:53 crc kubenswrapper[4906]: I0221 00:07:53.624679    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:07:53 crc kubenswrapper[4906]: I0221 00:07:53.624747    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:07:53 crc kubenswrapper[4906]: I0221 00:07:53.624824    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:07:53 crc kubenswrapper[4906]: I0221 00:07:53.624867    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:07:53 crc kubenswrapper[4906]: I0221 00:07:53.624888    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:07:54 crc kubenswrapper[4906]: I0221 00:07:54.454880    4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 06:42:14.793300197 +0000 UTC
Feb 21 00:07:54 crc kubenswrapper[4906]: I0221 00:07:54.497313    4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 21 00:07:54 crc kubenswrapper[4906]: I0221 00:07:54.497573    4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 00:07:54 crc kubenswrapper[4906]: I0221 00:07:54.499308    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:07:54 crc kubenswrapper[4906]: I0221 00:07:54.499377    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:07:54 crc kubenswrapper[4906]: I0221 00:07:54.499405    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:07:55 crc kubenswrapper[4906]: I0221 00:07:55.455124    4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 20:28:48.820220002 +0000 UTC
Feb 21 00:07:55 crc kubenswrapper[4906]: E0221 00:07:55.602710    4906 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 21 00:07:56 crc kubenswrapper[4906]: I0221 00:07:56.455848    4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 06:36:06.314219761 +0000 UTC
Feb 21 00:07:57 crc kubenswrapper[4906]: I0221 00:07:57.175963    4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 21 00:07:57 crc kubenswrapper[4906]: I0221 00:07:57.176642    4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 00:07:57 crc kubenswrapper[4906]: I0221 00:07:57.178232    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:07:57 crc kubenswrapper[4906]: I0221 00:07:57.178298    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:07:57 crc kubenswrapper[4906]: I0221 00:07:57.178313    4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:07:57 crc kubenswrapper[4906]: I0221 00:07:57.181340    4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started"
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 21 00:07:57 crc kubenswrapper[4906]: I0221 00:07:57.456529 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 06:26:10.242169275 +0000 UTC Feb 21 00:07:57 crc kubenswrapper[4906]: I0221 00:07:57.498052 4906 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 21 00:07:57 crc kubenswrapper[4906]: I0221 00:07:57.498156 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 21 00:07:57 crc kubenswrapper[4906]: I0221 00:07:57.559027 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Feb 21 00:07:57 crc kubenswrapper[4906]: I0221 00:07:57.559212 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 21 00:07:57 crc kubenswrapper[4906]: I0221 00:07:57.560555 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:07:57 crc kubenswrapper[4906]: I0221 00:07:57.560588 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:07:57 crc kubenswrapper[4906]: I0221 00:07:57.560601 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:07:57 crc 
kubenswrapper[4906]: I0221 00:07:57.608567 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 21 00:07:57 crc kubenswrapper[4906]: I0221 00:07:57.632992 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 21 00:07:57 crc kubenswrapper[4906]: I0221 00:07:57.634378 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:07:57 crc kubenswrapper[4906]: I0221 00:07:57.634440 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:07:57 crc kubenswrapper[4906]: I0221 00:07:57.634460 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:07:57 crc kubenswrapper[4906]: I0221 00:07:57.640491 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 21 00:07:58 crc kubenswrapper[4906]: I0221 00:07:58.457057 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 03:59:21.275601065 +0000 UTC Feb 21 00:07:58 crc kubenswrapper[4906]: I0221 00:07:58.636628 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 21 00:07:58 crc kubenswrapper[4906]: I0221 00:07:58.638050 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:07:58 crc kubenswrapper[4906]: I0221 00:07:58.638242 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:07:58 crc kubenswrapper[4906]: I0221 00:07:58.638377 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 
00:07:59 crc kubenswrapper[4906]: W0221 00:07:59.166620 4906 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 21 00:07:59 crc kubenswrapper[4906]: I0221 00:07:59.166745 4906 trace.go:236] Trace[201551121]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (21-Feb-2026 00:07:49.165) (total time: 10001ms): Feb 21 00:07:59 crc kubenswrapper[4906]: Trace[201551121]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (00:07:59.166) Feb 21 00:07:59 crc kubenswrapper[4906]: Trace[201551121]: [10.001211682s] [10.001211682s] END Feb 21 00:07:59 crc kubenswrapper[4906]: E0221 00:07:59.166772 4906 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 21 00:07:59 crc kubenswrapper[4906]: I0221 00:07:59.446983 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Feb 21 00:07:59 crc kubenswrapper[4906]: I0221 00:07:59.457430 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 04:19:27.931652171 +0000 UTC Feb 21 00:07:59 crc kubenswrapper[4906]: I0221 00:07:59.638913 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 21 00:07:59 crc kubenswrapper[4906]: I0221 00:07:59.640177 4906 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:07:59 crc kubenswrapper[4906]: I0221 00:07:59.640233 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:07:59 crc kubenswrapper[4906]: I0221 00:07:59.640254 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:07:59 crc kubenswrapper[4906]: W0221 00:07:59.735212 4906 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 21 00:07:59 crc kubenswrapper[4906]: I0221 00:07:59.735369 4906 trace.go:236] Trace[1606777307]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (21-Feb-2026 00:07:49.734) (total time: 10001ms): Feb 21 00:07:59 crc kubenswrapper[4906]: Trace[1606777307]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (00:07:59.735) Feb 21 00:07:59 crc kubenswrapper[4906]: Trace[1606777307]: [10.001221611s] [10.001221611s] END Feb 21 00:07:59 crc kubenswrapper[4906]: E0221 00:07:59.735411 4906 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 21 00:08:00 crc kubenswrapper[4906]: I0221 00:08:00.225095 4906 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" 
start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Feb 21 00:08:00 crc kubenswrapper[4906]: I0221 00:08:00.225191 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 21 00:08:00 crc kubenswrapper[4906]: I0221 00:08:00.230087 4906 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Feb 21 00:08:00 crc kubenswrapper[4906]: I0221 00:08:00.230193 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 21 00:08:00 crc kubenswrapper[4906]: I0221 00:08:00.279750 4906 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" 
start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Feb 21 00:08:00 crc kubenswrapper[4906]: I0221 00:08:00.279803 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 21 00:08:00 crc kubenswrapper[4906]: I0221 00:08:00.457615 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 13:35:40.379362867 +0000 UTC Feb 21 00:08:01 crc kubenswrapper[4906]: I0221 00:08:01.458733 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 22:08:50.663841361 +0000 UTC Feb 21 00:08:02 crc kubenswrapper[4906]: I0221 00:08:02.459395 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 20:53:38.791195736 +0000 UTC Feb 21 00:08:03 crc kubenswrapper[4906]: I0221 00:08:03.460366 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 04:21:02.306949097 +0000 UTC Feb 21 00:08:03 crc kubenswrapper[4906]: I0221 00:08:03.857988 4906 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 21 00:08:04 crc kubenswrapper[4906]: I0221 00:08:04.461345 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 
2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 19:50:36.84378194 +0000 UTC Feb 21 00:08:04 crc kubenswrapper[4906]: I0221 00:08:04.535283 4906 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 21 00:08:05 crc kubenswrapper[4906]: E0221 00:08:05.203566 4906 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.210791 4906 trace.go:236] Trace[1717428038]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (21-Feb-2026 00:07:51.672) (total time: 13538ms): Feb 21 00:08:05 crc kubenswrapper[4906]: Trace[1717428038]: ---"Objects listed" error: 13538ms (00:08:05.210) Feb 21 00:08:05 crc kubenswrapper[4906]: Trace[1717428038]: [13.538614424s] [13.538614424s] END Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.210848 4906 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.213753 4906 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.213855 4906 trace.go:236] Trace[973397031]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (21-Feb-2026 00:07:54.612) (total time: 10600ms): Feb 21 00:08:05 crc kubenswrapper[4906]: Trace[973397031]: ---"Objects listed" error: 10600ms (00:08:05.213) Feb 21 00:08:05 crc kubenswrapper[4906]: Trace[973397031]: [10.600824592s] [10.600824592s] END Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.213877 4906 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 21 00:08:05 crc kubenswrapper[4906]: E0221 00:08:05.219908 4906 kubelet_node_status.go:99] "Unable to register node 
with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.247832 4906 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.277172 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.282890 4906 csr.go:261] certificate signing request csr-98v6s is approved, waiting to be issued Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.283051 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.289701 4906 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:51022->192.168.126.11:17697: read: connection reset by peer" start-of-body= Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.289768 4906 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:51022->192.168.126.11:17697: read: connection reset by peer" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.290089 4906 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection 
refused" start-of-body= Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.290115 4906 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.290206 4906 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:51034->192.168.126.11:17697: read: connection reset by peer" start-of-body= Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.290276 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:51034->192.168.126.11:17697: read: connection reset by peer" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.291259 4906 csr.go:257] certificate signing request csr-98v6s is issued Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.310853 4906 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 21 00:08:05 crc kubenswrapper[4906]: W0221 00:08:05.311333 4906 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.Node ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Feb 21 00:08:05 crc kubenswrapper[4906]: W0221 00:08:05.311421 4906 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.Service ended with: very 
short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Feb 21 00:08:05 crc kubenswrapper[4906]: W0221 00:08:05.311573 4906 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.CSIDriver ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Feb 21 00:08:05 crc kubenswrapper[4906]: E0221 00:08:05.311475 4906 event.go:368] "Unable to write event (may retry after sleeping)" err="Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events/crc.18961a574344da9a\": read tcp 38.102.83.136:42750->38.102.83.136:6443: use of closed network connection" event="&Event{ObjectMeta:{crc.18961a574344da9a default 26175 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-21 00:07:45 +0000 UTC,LastTimestamp:2026-02-21 00:07:45.619873958 +0000 UTC m=+0.871461464,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.447433 4906 apiserver.go:52] "Watching apiserver" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.451102 4906 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.451449 4906 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.451883 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.452028 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.452166 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.452268 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.452371 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:08:05 crc kubenswrapper[4906]: E0221 00:08:05.452414 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.452441 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 21 00:08:05 crc kubenswrapper[4906]: E0221 00:08:05.452302 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:08:05 crc kubenswrapper[4906]: E0221 00:08:05.452801 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.461469 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 11:54:26.880551708 +0000 UTC Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.467086 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.467094 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.467142 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.467423 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.467157 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.467433 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.467155 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.467647 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.467721 4906 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.501099 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.516773 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.544925 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8cf02d3-2a07-464d-b75f-8d3ad8374553\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://104130e8d59df60560691070c803ca780c06ea5439f8d04c160d53df2042c41f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"
2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.550376 4906 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.566036 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.576610 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8cf02d3-2a07-464d-b75f-8d3ad8374553\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://104130e8d59df60560691070c803ca780c06ea5439f8d04c160d53df2042c41f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"
2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.587736 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.601015 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.615714 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.616206 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.616263 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.616311 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: 
\"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.616720 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.617045 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.617120 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.616338 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.617229 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.617255 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.617253 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.617278 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.617347 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.617370 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.617389 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.617407 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.617424 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.617442 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.617459 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.617475 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.617492 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.617538 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 21 
00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.617554 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.617570 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.617586 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.617606 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.617625 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.617643 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.617679 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.617726 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.617752 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.617885 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.617904 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 
00:08:05.617925 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.617942 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.617962 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.617980 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.618014 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.618032 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod 
\"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.618048 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.618064 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.618095 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.618113 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.618137 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.618163 4906 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.618202 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.618228 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.618250 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.618294 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.618318 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.618341 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.618364 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.618388 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.618392 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.618408 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.618435 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.618540 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.618563 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.618578 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.618593 4906 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.618598 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.618608 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.618652 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.618701 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.618729 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.618754 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.618779 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.618813 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.618830 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.618838 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.618876 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.618894 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.618912 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.618936 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.618972 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.618993 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.619039 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.619050 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.619070 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.619281 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.619090 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.619388 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.619414 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.619441 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.619461 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: 
\"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.619464 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.619480 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.619497 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.619524 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.619552 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.619572 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.619587 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.619607 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.619628 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.619652 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.619673 4906 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.619742 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.619766 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.619787 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.619808 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.619827 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.619842 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.619856 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.619874 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.619891 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.619905 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.619922 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.619937 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.619954 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.619971 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.619987 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.620009 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 
00:08:05.620031 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.620053 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.620073 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.620094 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.620116 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.620142 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.620190 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.620214 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.620235 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.620258 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.620280 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.620300 4906 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.620321 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.620346 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.620365 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.620387 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.620410 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: 
\"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.620433 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.620455 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.620478 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.620557 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.620591 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.620608 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.620632 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.620653 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.620676 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.620718 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.620745 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: 
\"43509403-f426-496e-be36-56cef71462f5\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.620801 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.620828 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.620853 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.620875 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.620900 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.620925 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.620949 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.620972 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.621094 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.621129 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.621159 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 21 
00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.621185 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.621209 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.621234 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.621260 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.621283 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.621309 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: 
\"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.621372 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.621395 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.621416 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.621433 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.621451 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 21 00:08:05 crc 
kubenswrapper[4906]: I0221 00:08:05.621468 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.621489 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.621570 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.621587 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.621605 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.621624 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.621641 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.621658 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.621674 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.621706 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.621726 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.621745 4906 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.621761 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.621778 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.621797 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.621815 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.621831 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.621848 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.621867 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.621885 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.621906 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.621923 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.621941 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.621959 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.621975 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.621993 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.622010 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.622026 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod 
\"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.622042 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.622059 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.622075 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.622094 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.622111 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.622128 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.622148 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.622168 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.622187 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.622206 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.622222 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.622239 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.622257 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.622275 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.622295 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.622314 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.622360 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" 
(UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.622398 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.622429 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.622449 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.622476 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.622498 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.622520 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.622542 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.622579 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.622601 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: 
\"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.622622 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.622643 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.622676 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.622709 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.622781 4906 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" 
DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.622792 4906 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.622802 4906 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.622811 4906 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.622821 4906 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.622853 4906 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.622863 4906 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.622872 4906 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 
00:08:05.622882 4906 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.622892 4906 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.624883 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.637280 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.619951 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.620119 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.620314 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.620484 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.621855 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.641306 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.621863 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.622131 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.622306 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.622992 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.623202 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.623517 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.623880 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.623913 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.624140 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.624160 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.624415 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.624419 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.624679 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.624712 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.624940 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.625015 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.625543 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.625920 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.626365 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: E0221 00:08:05.626511 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:08:06.126478956 +0000 UTC m=+21.378066652 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.627364 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.627383 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.627669 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.627721 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.628007 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.628370 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.628822 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.629151 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.629395 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.630465 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.631026 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.631476 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.631609 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.632003 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.632235 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.632255 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.632274 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.633301 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.633678 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.634216 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.634534 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.634563 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.635841 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.636091 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.636324 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.636491 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.636525 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.636786 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.636794 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.636836 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.636879 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.637084 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.637299 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.637438 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.637741 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.637840 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.637954 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.638203 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.638280 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.638376 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.638384 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.638560 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.639157 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.639211 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.639457 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.639697 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.639855 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.640217 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.640803 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.640881 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.640880 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.641024 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.641119 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.641556 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.641791 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.642240 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.642295 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.642467 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.644215 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.644407 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.644543 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.644570 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.644844 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.645168 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.645200 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.645380 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.643245 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.646031 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.646128 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.646411 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.646435 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.646825 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.646996 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.647133 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.647271 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.647604 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.647647 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.647654 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.647893 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.648002 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.648159 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.648581 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.648618 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.648828 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.648940 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.649044 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.649094 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.649421 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.647918 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.649923 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.649971 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: E0221 00:08:05.650284 4906 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.650305 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: E0221 00:08:05.650442 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-21 00:08:06.15039679 +0000 UTC m=+21.401984306 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.650474 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.650724 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.650751 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.651782 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.652042 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.652260 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.652465 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.652515 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.652626 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.652644 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.650436 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn".
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.653331 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.653512 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.653744 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.642783 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: E0221 00:08:05.654435 4906 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.657808 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: E0221 00:08:05.657837 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-21 00:08:06.157793525 +0000 UTC m=+21.409381051 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.657835 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.657910 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.657930 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.657977 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.658141 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.656424 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.655457 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.655543 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.656094 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.656222 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.656265 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.656290 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.656292 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.656949 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.643099 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.657306 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.657424 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.657599 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.658380 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.658512 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.658547 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.658722 4906 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.658811 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.659169 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.660922 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.660998 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.661288 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.661392 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.662181 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.663137 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.663426 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: E0221 00:08:05.664111 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 21 00:08:05 crc kubenswrapper[4906]: E0221 00:08:05.664321 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 21 00:08:05 crc kubenswrapper[4906]: E0221 00:08:05.664944 4906 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 00:08:05 crc kubenswrapper[4906]: E0221 00:08:05.665102 4906 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-21 00:08:06.165073807 +0000 UTC m=+21.416661303 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.665305 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.671732 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: E0221 00:08:05.672076 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 21 00:08:05 crc kubenswrapper[4906]: E0221 00:08:05.672114 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 21 00:08:05 crc kubenswrapper[4906]: E0221 00:08:05.672130 4906 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 00:08:05 crc kubenswrapper[4906]: E0221 00:08:05.672187 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-21 00:08:06.172167844 +0000 UTC m=+21.423755350 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.672794 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.675849 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.664675 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.682972 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). 
InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.683089 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.683188 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.683268 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.683312 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.683373 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.683500 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.690246 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.690275 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.690510 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.690502 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.690601 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.690627 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.690600 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.691200 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.682990 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.691441 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.691670 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: 
"cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.692418 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.694583 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.700094 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.705940 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.705958 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.711715 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.714297 4906 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="104130e8d59df60560691070c803ca780c06ea5439f8d04c160d53df2042c41f" exitCode=255 Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.714429 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"104130e8d59df60560691070c803ca780c06ea5439f8d04c160d53df2042c41f"} Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.721473 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.721892 4906 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 21 00:08:05 crc kubenswrapper[4906]: E0221 00:08:05.722274 4906 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-crc\" already exists" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.723142 4906 scope.go:117] "RemoveContainer" containerID="104130e8d59df60560691070c803ca780c06ea5439f8d04c160d53df2042c41f" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.723472 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.723550 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.723616 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.723649 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.723672 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.723707 4906 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.723722 4906 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.723734 4906 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc 
kubenswrapper[4906]: I0221 00:08:05.723751 4906 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.723769 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.723781 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.723796 4906 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.723811 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.723824 4906 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.723835 4906 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: 
I0221 00:08:05.723846 4906 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.723852 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.723858 4906 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.723996 4906 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.724008 4906 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.724020 4906 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.724032 4906 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc 
kubenswrapper[4906]: I0221 00:08:05.724042 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.724053 4906 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.724063 4906 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.723640 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.724112 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.724155 4906 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.724172 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.724188 4906 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.724200 4906 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.724227 4906 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.724238 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc 
kubenswrapper[4906]: I0221 00:08:05.724247 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.724256 4906 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.724267 4906 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.724281 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.724416 4906 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.724426 4906 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.724436 4906 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc 
kubenswrapper[4906]: I0221 00:08:05.724445 4906 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.724454 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.724591 4906 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.724603 4906 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.724614 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.724704 4906 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.724718 4906 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc 
kubenswrapper[4906]: I0221 00:08:05.724728 4906 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.724738 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.724749 4906 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.724758 4906 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.724794 4906 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.724810 4906 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.724827 4906 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.728743 4906 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.728789 4906 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.728804 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.728857 4906 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.728871 4906 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.728883 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.728896 4906 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.728905 4906 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.728915 4906 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.728924 4906 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.728934 4906 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.728944 4906 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.728954 4906 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.728963 4906 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.728976 4906 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc 
kubenswrapper[4906]: I0221 00:08:05.728986 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.728996 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729006 4906 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729015 4906 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729025 4906 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729048 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729058 4906 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729071 4906 reconciler_common.go:293] "Volume 
detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729081 4906 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729099 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729109 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729127 4906 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729139 4906 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729150 4906 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729160 4906 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729170 4906 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729180 4906 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729190 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729199 4906 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729208 4906 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729217 4906 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729229 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 21 
00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729239 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729249 4906 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729263 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729276 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729290 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729302 4906 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729313 4906 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729325 4906 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729338 4906 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729351 4906 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729362 4906 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729375 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729389 4906 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729400 4906 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729412 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: 
\"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729427 4906 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729495 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729514 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729526 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729540 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729552 4906 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729567 4906 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729580 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729593 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729614 4906 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729626 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729641 4906 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729653 4906 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729665 4906 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node 
\"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729678 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729709 4906 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729724 4906 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729736 4906 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729750 4906 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729764 4906 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729778 4906 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729790 4906 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729820 4906 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729833 4906 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729847 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729862 4906 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729889 4906 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729900 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729911 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: 
\"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729920 4906 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729930 4906 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729939 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729949 4906 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729959 4906 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729970 4906 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729980 4906 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node 
\"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.729990 4906 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.730001 4906 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.730010 4906 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.730019 4906 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.730028 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.730038 4906 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.730048 4906 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.730058 4906 
reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.730068 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.730077 4906 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.730086 4906 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.730097 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.730110 4906 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.730126 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.730137 4906 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.730149 4906 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.730161 4906 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.730172 4906 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.730180 4906 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.730192 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.730201 4906 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.730209 4906 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.730219 4906 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.730228 4906 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.730237 4906 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.730247 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.730257 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.730266 4906 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.730275 4906 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.730284 4906 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.730293 4906 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.731733 4906 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.731751 4906 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.731766 4906 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.731781 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.731795 4906 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 
00:08:05.731808 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.731821 4906 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.731834 4906 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.731847 4906 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.731859 4906 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.731872 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.731885 4906 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.733375 4906 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.734681 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.738409 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.747936 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.748945 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.758972 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8cf02d3-2a07-464d-b75f-8d3ad8374553\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://104130e8d59df60560691070c803ca780c06ea5439f8d04c160d53df2042c41f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha
256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.767313 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.778905 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.779899 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.784867 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.805301 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.805482 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.822602 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8cf02d3-2a07-464d-b75f-8d3ad8374553\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://104130e8d59df60560691070c803ca780c06ea5439f8d04c160d53df2042c41f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104130e8d59df60560691070c803ca780c06ea5439f8d04c160d53df2042c41f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": 
net/http: TLS handshake timeout\\\\nI0221 00:07:59.169352 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 00:07:59.172022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-365547171/tls.crt::/tmp/serving-cert-365547171/tls.key\\\\\\\"\\\\nI0221 00:08:05.242429 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 00:08:05.251263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 00:08:05.251291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 00:08:05.251317 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 00:08:05.251325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 00:08:05.269495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0221 00:08:05.269788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269943 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 00:08:05.269971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 00:08:05.269996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 00:08:05.270022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0221 00:08:05.269815 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0221 00:08:05.276526 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.833588 4906 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.833707 4906 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.833769 4906 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.833822 4906 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.850951 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.868760 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.879172 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.894884 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.905215 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 21 00:08:05 crc kubenswrapper[4906]: I0221 00:08:05.915366 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 21 00:08:06 crc kubenswrapper[4906]: I0221 00:08:06.135520 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:08:06 crc kubenswrapper[4906]: E0221 00:08:06.135664 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:08:07.135647399 +0000 UTC m=+22.387234905 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:08:06 crc kubenswrapper[4906]: I0221 00:08:06.236639 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:08:06 crc kubenswrapper[4906]: I0221 00:08:06.236775 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:08:06 crc kubenswrapper[4906]: I0221 00:08:06.236835 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:08:06 crc kubenswrapper[4906]: I0221 00:08:06.236896 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:08:06 crc kubenswrapper[4906]: E0221 00:08:06.236929 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 21 00:08:06 crc kubenswrapper[4906]: E0221 00:08:06.236940 4906 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 21 00:08:06 crc kubenswrapper[4906]: E0221 00:08:06.236996 4906 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 21 00:08:06 crc kubenswrapper[4906]: E0221 00:08:06.236962 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 21 00:08:06 crc kubenswrapper[4906]: E0221 00:08:06.237045 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-21 00:08:07.237028401 +0000 UTC m=+22.488615907 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 21 00:08:06 crc kubenswrapper[4906]: E0221 00:08:06.237045 4906 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 00:08:06 crc kubenswrapper[4906]: E0221 00:08:06.237060 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-21 00:08:07.237054311 +0000 UTC m=+22.488641817 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 21 00:08:06 crc kubenswrapper[4906]: E0221 00:08:06.237104 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 21 00:08:06 crc kubenswrapper[4906]: E0221 00:08:06.237145 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 21 00:08:06 crc kubenswrapper[4906]: E0221 00:08:06.237170 4906 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 00:08:06 crc kubenswrapper[4906]: E0221 00:08:06.237117 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-21 00:08:07.237089052 +0000 UTC m=+22.488676648 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 00:08:06 crc kubenswrapper[4906]: E0221 00:08:06.237302 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-21 00:08:07.237271547 +0000 UTC m=+22.488859133 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 00:08:06 crc kubenswrapper[4906]: I0221 00:08:06.292740 4906 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-21 00:03:05 +0000 UTC, rotation deadline is 2026-12-26 12:46:16.459518277 +0000 UTC Feb 21 00:08:06 crc kubenswrapper[4906]: I0221 00:08:06.292855 4906 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7404h38m10.166665598s for next certificate rotation Feb 21 00:08:06 crc kubenswrapper[4906]: I0221 00:08:06.462342 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 13:36:47.257624977 +0000 UTC Feb 21 00:08:06 crc kubenswrapper[4906]: I0221 00:08:06.718433 4906 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"87ceaf11f6f4645b302c0c6371902385cc6f8d6bdaa01cedf578765b7d72ca44"} Feb 21 00:08:06 crc kubenswrapper[4906]: I0221 00:08:06.718492 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"a8af6bba00f6a2f727a0cce86531ddfe7723f397e9373f3921dbfcbb1b136f89"} Feb 21 00:08:06 crc kubenswrapper[4906]: I0221 00:08:06.720397 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e655a3c2f3a6223fe12dded10c0c7b2e8b5024914ead29887a06efd4141b670e"} Feb 21 00:08:06 crc kubenswrapper[4906]: I0221 00:08:06.720445 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"8c0b66a5fe8b81df7b0edfcef98acab5208abfce4cedc368707c6ce804b9b99a"} Feb 21 00:08:06 crc kubenswrapper[4906]: I0221 00:08:06.720456 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"4767d4c6d7c2e6bf37258abf0cb336f0490321db7a55954e739d2bd417543ddf"} Feb 21 00:08:06 crc kubenswrapper[4906]: I0221 00:08:06.724959 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"e7d5d4b04e24c93d555884c018cc13ff61a5e82542cd3ac21d568bb1175f6e14"} Feb 21 00:08:06 crc kubenswrapper[4906]: I0221 00:08:06.727154 4906 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 21 00:08:06 crc kubenswrapper[4906]: I0221 00:08:06.728945 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8d527700428fdbde47902ae05e7592fb0f3b43af16fa72386eb4a1ff274156a5"} Feb 21 00:08:06 crc kubenswrapper[4906]: I0221 00:08:06.748081 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8cf02d3-2a07-464d-b75f-8d3ad8374553\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://104130e8d59df60560691070c803ca780c06ea5439f8d04c160d53df2042c41f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104130e8d59df60560691070c803ca780c06ea5439f8d04c160d53df2042c41f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 00:07:59.169352 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 00:07:59.172022 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-365547171/tls.crt::/tmp/serving-cert-365547171/tls.key\\\\\\\"\\\\nI0221 00:08:05.242429 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 00:08:05.251263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 00:08:05.251291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 00:08:05.251317 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 00:08:05.251325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 00:08:05.269495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0221 00:08:05.269788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269943 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 00:08:05.269971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 00:08:05.269996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 00:08:05.270022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0221 00:08:05.269815 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0221 00:08:05.276526 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:06Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:06 crc kubenswrapper[4906]: I0221 00:08:06.793653 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ceaf11f6f4645b302c0c6371902385cc6f8d6bdaa01cedf578765b7d72ca44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:06Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:06 crc kubenswrapper[4906]: I0221 00:08:06.813061 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:06Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:06 crc kubenswrapper[4906]: I0221 00:08:06.844391 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:06Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:06 crc kubenswrapper[4906]: I0221 00:08:06.870541 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3359de6c-dfba-4630-b39d-68e056b5d2ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7928eee765ac7aaa7118868638603d627a00b59850dac177c991754fc324122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d4b213ed7600da4237ff1c423d004e3afdaff5f599b453398266f6cdae16ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3218b20e5cf2be1651a9fbaef483c8cb2bd297449319f559727af1a47661c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f5568f3c4167bda4510df4f94df728f3286e7c8137bb0ac9f6af2c30c4992a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imag
eID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:06Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:06 crc kubenswrapper[4906]: I0221 00:08:06.886123 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:06Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:06 crc kubenswrapper[4906]: I0221 00:08:06.899041 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:06Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:06 crc kubenswrapper[4906]: I0221 00:08:06.915331 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:06Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:06 crc kubenswrapper[4906]: I0221 00:08:06.927005 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8cf02d3-2a07-464d-b75f-8d3ad8374553\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d527700428fdbde47902ae05e7592fb0f3b43af16fa72386eb4a1ff274156a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104130e8d59df60560691070c803ca780c06ea5439f8d04c160d53df2042c41f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 00:07:59.169352 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 00:07:59.172022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-365547171/tls.crt::/tmp/serving-cert-365547171/tls.key\\\\\\\"\\\\nI0221 00:08:05.242429 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 00:08:05.251263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 00:08:05.251291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 00:08:05.251317 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 00:08:05.251325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 00:08:05.269495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0221 00:08:05.269788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269943 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 00:08:05.269971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 00:08:05.269996 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 00:08:05.270022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0221 00:08:05.269815 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0221 00:08:05.276526 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:06Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:06 crc kubenswrapper[4906]: I0221 00:08:06.941535 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ceaf11f6f4645b302c0c6371902385cc6f8d6bdaa01cedf578765b7d72ca44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:06Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:06 crc kubenswrapper[4906]: I0221 00:08:06.953299 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:06Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:06 crc kubenswrapper[4906]: I0221 00:08:06.966144 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3359de6c-dfba-4630-b39d-68e056b5d2ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7928eee765ac7aaa7118868638603d627a00b59850dac177c991754fc324122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d4b213ed7600da4237ff1c423d004e3afdaff5f599b453398266f6cdae16ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3218b20e5cf2be1651a9fbaef483c8cb2bd297449319f559727af1a47661c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f5568f3c4167bda4510df4f94df728f3286e7c8137bb0ac9f6af2c30c4992a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:06Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:06 crc kubenswrapper[4906]: I0221 00:08:06.977350 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:06Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:06 crc kubenswrapper[4906]: I0221 00:08:06.989742 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:06Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.001430 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:06Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.012962 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e655a3c2f3a6223fe12dded10c0c7b2e8b5024914ead29887a06efd4141b670e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c0b66a5fe8b81df7b0edfcef98acab5208abfce4cedc368707c6ce804b9b99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:07Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:07 crc kubenswrapper[4906]: E0221 00:08:07.143775 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:08:09.143754589 +0000 UTC m=+24.395342095 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.143818 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.244251 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.244306 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.244326 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod 
\"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.244347 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:08:07 crc kubenswrapper[4906]: E0221 00:08:07.244419 4906 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 21 00:08:07 crc kubenswrapper[4906]: E0221 00:08:07.244522 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-21 00:08:09.244499913 +0000 UTC m=+24.496087419 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 21 00:08:07 crc kubenswrapper[4906]: E0221 00:08:07.244519 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 21 00:08:07 crc kubenswrapper[4906]: E0221 00:08:07.244567 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 21 00:08:07 crc kubenswrapper[4906]: E0221 00:08:07.244584 4906 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 00:08:07 crc kubenswrapper[4906]: E0221 00:08:07.244445 4906 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 21 00:08:07 crc kubenswrapper[4906]: E0221 00:08:07.244654 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-21 00:08:09.244627237 +0000 UTC m=+24.496214823 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 00:08:07 crc kubenswrapper[4906]: E0221 00:08:07.244639 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 21 00:08:07 crc kubenswrapper[4906]: E0221 00:08:07.244712 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-21 00:08:09.244679048 +0000 UTC m=+24.496266634 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 21 00:08:07 crc kubenswrapper[4906]: E0221 00:08:07.244724 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 21 00:08:07 crc kubenswrapper[4906]: E0221 00:08:07.244764 4906 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 00:08:07 crc kubenswrapper[4906]: E0221 00:08:07.244828 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-21 00:08:09.244813722 +0000 UTC m=+24.496401228 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.463436 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 16:48:05.001697276 +0000 UTC Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.516921 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.516965 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:08:07 crc kubenswrapper[4906]: E0221 00:08:07.517062 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.516943 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:08:07 crc kubenswrapper[4906]: E0221 00:08:07.517177 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:08:07 crc kubenswrapper[4906]: E0221 00:08:07.517271 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.522599 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.523279 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.523937 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.524521 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" 
path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.525081 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.525572 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.526133 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.526636 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.527273 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.527784 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.528241 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.528886 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.529341 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.529848 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.530328 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.533202 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.534726 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.535522 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.538096 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.539871 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.541331 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.543781 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.544918 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.547342 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.548348 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.552932 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.554362 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.556306 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" 
path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.558005 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.561927 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.562919 4906 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.563121 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.567964 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.569989 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.570670 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.572938 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.574241 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.574985 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.576340 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.577335 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.578590 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.579400 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.580744 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.581555 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.582668 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.583368 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.584489 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.585487 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.586617 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.587268 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.588430 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.589121 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.589867 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.590953 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.591698 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-tfzxw"] Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.592360 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-cqkxl"] Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.592526 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-hbhzd"] Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.592593 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-tfzxw" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.592738 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-cqkxl" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.593354 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-b9qdv"] Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.593540 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-hbhzd" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.594212 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bmsd9"] Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.594465 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.597896 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.599819 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.599901 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.599954 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.600202 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.600731 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.602337 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.602515 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.603597 4906 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.603762 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.603836 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.604011 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.604091 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.604457 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.604505 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.604646 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.604656 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.605198 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.605427 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.605597 4906 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.605947 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.607834 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.607879 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.608358 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.619530 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3359de6c-dfba-4630-b39d-68e056b5d2ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7928eee765ac7aaa7118868638603d627a00b59850dac177c991754fc324122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d4b213ed7600da4237ff1c423d004e3afdaff5f599b453398266f6cdae16ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3218b20e5cf2be1651a9fbaef483c8cb2bd297449319f559727af1a47661c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f5568f3c4167bda4510df4f94df728f3286e7c8137bb0ac9f6af2c30c4992a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:07Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.628105 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.633842 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.643860 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:07Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.667116 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:07Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.677315 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:07Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.693291 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e655a3c2f3a6223fe12dded10c0c7b2e8b5024914ead29887a06efd4141b670e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c0b66a5fe8b81df7b0edfcef98acab5208abfce4cedc368707c6ce804b9b99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:07Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.705510 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cqkxl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15db4e7-a13a-4bd9-8083-1ed09be64a82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cndgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cqkxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:07Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.715339 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:07Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.731724 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfzxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfzxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:07Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.732963 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.746435 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8cf02d3-2a07-464d-b75f-8d3ad8374553\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d527700428fdbde47902ae05e7592fb0f3b43af16fa72386eb4a1ff274156a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104130e8d59df60560691070c803ca780c06ea5439f8d04c160d53df2042c41f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 00:07:59.169352 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 00:07:59.172022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-365547171/tls.crt::/tmp/serving-cert-365547171/tls.key\\\\\\\"\\\\nI0221 00:08:05.242429 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 00:08:05.251263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 00:08:05.251291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 00:08:05.251317 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 00:08:05.251325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 00:08:05.269495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0221 00:08:05.269788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269943 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 00:08:05.269971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 00:08:05.269996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 00:08:05.270022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0221 00:08:05.269815 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0221 00:08:05.276526 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:07Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.747747 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/23efa997-378b-44cd-9f05-4a80559cd09b-ovnkube-script-lib\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.747958 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d15db4e7-a13a-4bd9-8083-1ed09be64a82-hostroot\") pod \"multus-cqkxl\" (UID: \"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.748030 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d15db4e7-a13a-4bd9-8083-1ed09be64a82-host-run-multus-certs\") pod \"multus-cqkxl\" (UID: \"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.748070 4906 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-host-kubelet\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.748105 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.748164 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/43e3c12f-3db0-4bb9-abb2-e78756ad93a5-cnibin\") pod \"multus-additional-cni-plugins-tfzxw\" (UID: \"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\") " pod="openshift-multus/multus-additional-cni-plugins-tfzxw" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.748191 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-run-openvswitch\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.748221 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d15db4e7-a13a-4bd9-8083-1ed09be64a82-multus-socket-dir-parent\") pod \"multus-cqkxl\" (UID: \"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl" Feb 21 00:08:07 
crc kubenswrapper[4906]: I0221 00:08:07.748252 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plbk2\" (UniqueName: \"kubernetes.io/projected/43e3c12f-3db0-4bb9-abb2-e78756ad93a5-kube-api-access-plbk2\") pod \"multus-additional-cni-plugins-tfzxw\" (UID: \"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\") " pod="openshift-multus/multus-additional-cni-plugins-tfzxw" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.748292 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-host-run-netns\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.748357 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d15db4e7-a13a-4bd9-8083-1ed09be64a82-system-cni-dir\") pod \"multus-cqkxl\" (UID: \"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.748390 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d15db4e7-a13a-4bd9-8083-1ed09be64a82-host-var-lib-cni-multus\") pod \"multus-cqkxl\" (UID: \"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.748439 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-systemd-units\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.748488 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/23efa997-378b-44cd-9f05-4a80559cd09b-ovn-node-metrics-cert\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.748552 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d15db4e7-a13a-4bd9-8083-1ed09be64a82-multus-daemon-config\") pod \"multus-cqkxl\" (UID: \"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.748615 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-host-cni-bin\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.748655 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sbct\" (UniqueName: \"kubernetes.io/projected/23efa997-378b-44cd-9f05-4a80559cd09b-kube-api-access-8sbct\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.748741 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-etc-openvswitch\") pod \"ovnkube-node-bmsd9\" (UID: 
\"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.748778 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-log-socket\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.748841 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5zp2\" (UniqueName: \"kubernetes.io/projected/f24b0c67-fe8d-4e72-916a-d82306a8b82e-kube-api-access-b5zp2\") pod \"node-resolver-hbhzd\" (UID: \"f24b0c67-fe8d-4e72-916a-d82306a8b82e\") " pod="openshift-dns/node-resolver-hbhzd" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.748875 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-var-lib-openvswitch\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.748909 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d15db4e7-a13a-4bd9-8083-1ed09be64a82-multus-cni-dir\") pod \"multus-cqkxl\" (UID: \"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.748945 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5zr5\" (UniqueName: \"kubernetes.io/projected/17518505-fa81-4399-b6cd-5527dae35ef3-kube-api-access-z5zr5\") pod 
\"machine-config-daemon-b9qdv\" (UID: \"17518505-fa81-4399-b6cd-5527dae35ef3\") " pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.748978 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/43e3c12f-3db0-4bb9-abb2-e78756ad93a5-system-cni-dir\") pod \"multus-additional-cni-plugins-tfzxw\" (UID: \"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\") " pod="openshift-multus/multus-additional-cni-plugins-tfzxw" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.749023 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-node-log\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.749067 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/23efa997-378b-44cd-9f05-4a80559cd09b-ovnkube-config\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.749134 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d15db4e7-a13a-4bd9-8083-1ed09be64a82-host-var-lib-kubelet\") pod \"multus-cqkxl\" (UID: \"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.749200 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-run-ovn\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.749243 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d15db4e7-a13a-4bd9-8083-1ed09be64a82-os-release\") pod \"multus-cqkxl\" (UID: \"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.749284 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/43e3c12f-3db0-4bb9-abb2-e78756ad93a5-os-release\") pod \"multus-additional-cni-plugins-tfzxw\" (UID: \"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\") " pod="openshift-multus/multus-additional-cni-plugins-tfzxw" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.749323 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/43e3c12f-3db0-4bb9-abb2-e78756ad93a5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tfzxw\" (UID: \"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\") " pod="openshift-multus/multus-additional-cni-plugins-tfzxw" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.749360 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-host-run-ovn-kubernetes\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.749404 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/23efa997-378b-44cd-9f05-4a80559cd09b-env-overrides\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.749447 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d15db4e7-a13a-4bd9-8083-1ed09be64a82-host-var-lib-cni-bin\") pod \"multus-cqkxl\" (UID: \"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.749498 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d15db4e7-a13a-4bd9-8083-1ed09be64a82-cni-binary-copy\") pod \"multus-cqkxl\" (UID: \"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.749528 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d15db4e7-a13a-4bd9-8083-1ed09be64a82-host-run-netns\") pod \"multus-cqkxl\" (UID: \"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.749584 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/43e3c12f-3db0-4bb9-abb2-e78756ad93a5-cni-binary-copy\") pod \"multus-additional-cni-plugins-tfzxw\" (UID: \"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\") " pod="openshift-multus/multus-additional-cni-plugins-tfzxw" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.749616 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-run-systemd\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.749662 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/17518505-fa81-4399-b6cd-5527dae35ef3-rootfs\") pod \"machine-config-daemon-b9qdv\" (UID: \"17518505-fa81-4399-b6cd-5527dae35ef3\") " pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.749731 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-host-cni-netd\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.749762 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-host-slash\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.749798 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d15db4e7-a13a-4bd9-8083-1ed09be64a82-cnibin\") pod \"multus-cqkxl\" (UID: \"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.749871 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d15db4e7-a13a-4bd9-8083-1ed09be64a82-multus-conf-dir\") pod \"multus-cqkxl\" (UID: \"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.749930 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/17518505-fa81-4399-b6cd-5527dae35ef3-proxy-tls\") pod \"machine-config-daemon-b9qdv\" (UID: \"17518505-fa81-4399-b6cd-5527dae35ef3\") " pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.749965 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/43e3c12f-3db0-4bb9-abb2-e78756ad93a5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tfzxw\" (UID: \"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\") " pod="openshift-multus/multus-additional-cni-plugins-tfzxw" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.750001 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f24b0c67-fe8d-4e72-916a-d82306a8b82e-hosts-file\") pod \"node-resolver-hbhzd\" (UID: \"f24b0c67-fe8d-4e72-916a-d82306a8b82e\") " pod="openshift-dns/node-resolver-hbhzd" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.750025 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cndgv\" (UniqueName: \"kubernetes.io/projected/d15db4e7-a13a-4bd9-8083-1ed09be64a82-kube-api-access-cndgv\") pod \"multus-cqkxl\" (UID: \"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.750084 4906 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/17518505-fa81-4399-b6cd-5527dae35ef3-mcd-auth-proxy-config\") pod \"machine-config-daemon-b9qdv\" (UID: \"17518505-fa81-4399-b6cd-5527dae35ef3\") " pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.750115 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d15db4e7-a13a-4bd9-8083-1ed09be64a82-host-run-k8s-cni-cncf-io\") pod \"multus-cqkxl\" (UID: \"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.750137 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d15db4e7-a13a-4bd9-8083-1ed09be64a82-etc-kubernetes\") pod \"multus-cqkxl\" (UID: \"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.761756 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ceaf11f6f4645b302c0c6371902385cc6f8d6bdaa01cedf578765b7d72ca44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:07Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.774218 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e655a3c2f3a6223fe12dded10c0c7b2e8b5024914ead29887a06efd4141b670e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://8c0b66a5fe8b81df7b0edfcef98acab5208abfce4cedc368707c6ce804b9b99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:07Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.786325 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:07Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.805086 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfzxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfzxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:07Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.815802 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hbhzd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24b0c67-fe8d-4e72-916a-d82306a8b82e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5zp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hbhzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:07Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.836996 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23efa997-378b-44cd-9f05-4a80559cd09b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmsd9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:07Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.850912 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/17518505-fa81-4399-b6cd-5527dae35ef3-rootfs\") pod \"machine-config-daemon-b9qdv\" (UID: \"17518505-fa81-4399-b6cd-5527dae35ef3\") " pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.851291 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-host-cni-netd\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.851360 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-host-cni-netd\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.851065 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/17518505-fa81-4399-b6cd-5527dae35ef3-rootfs\") pod \"machine-config-daemon-b9qdv\" (UID: \"17518505-fa81-4399-b6cd-5527dae35ef3\") " pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.851384 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/d15db4e7-a13a-4bd9-8083-1ed09be64a82-cnibin\") pod \"multus-cqkxl\" (UID: \"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.851600 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d15db4e7-a13a-4bd9-8083-1ed09be64a82-multus-conf-dir\") pod \"multus-cqkxl\" (UID: \"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.851721 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/17518505-fa81-4399-b6cd-5527dae35ef3-proxy-tls\") pod \"machine-config-daemon-b9qdv\" (UID: \"17518505-fa81-4399-b6cd-5527dae35ef3\") " pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.851832 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/43e3c12f-3db0-4bb9-abb2-e78756ad93a5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tfzxw\" (UID: \"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\") " pod="openshift-multus/multus-additional-cni-plugins-tfzxw" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.851913 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f24b0c67-fe8d-4e72-916a-d82306a8b82e-hosts-file\") pod \"node-resolver-hbhzd\" (UID: \"f24b0c67-fe8d-4e72-916a-d82306a8b82e\") " pod="openshift-dns/node-resolver-hbhzd" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.851511 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d15db4e7-a13a-4bd9-8083-1ed09be64a82-cnibin\") pod \"multus-cqkxl\" (UID: 
\"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.852021 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f24b0c67-fe8d-4e72-916a-d82306a8b82e-hosts-file\") pod \"node-resolver-hbhzd\" (UID: \"f24b0c67-fe8d-4e72-916a-d82306a8b82e\") " pod="openshift-dns/node-resolver-hbhzd" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.851734 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d15db4e7-a13a-4bd9-8083-1ed09be64a82-multus-conf-dir\") pod \"multus-cqkxl\" (UID: \"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.852215 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-host-slash\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.852366 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/17518505-fa81-4399-b6cd-5527dae35ef3-mcd-auth-proxy-config\") pod \"machine-config-daemon-b9qdv\" (UID: \"17518505-fa81-4399-b6cd-5527dae35ef3\") " pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.853231 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d15db4e7-a13a-4bd9-8083-1ed09be64a82-host-run-k8s-cni-cncf-io\") pod \"multus-cqkxl\" (UID: \"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl" Feb 21 00:08:07 
crc kubenswrapper[4906]: I0221 00:08:07.853377 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d15db4e7-a13a-4bd9-8083-1ed09be64a82-etc-kubernetes\") pod \"multus-cqkxl\" (UID: \"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.853479 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cndgv\" (UniqueName: \"kubernetes.io/projected/d15db4e7-a13a-4bd9-8083-1ed09be64a82-kube-api-access-cndgv\") pod \"multus-cqkxl\" (UID: \"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.852418 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-host-slash\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.853175 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/17518505-fa81-4399-b6cd-5527dae35ef3-mcd-auth-proxy-config\") pod \"machine-config-daemon-b9qdv\" (UID: \"17518505-fa81-4399-b6cd-5527dae35ef3\") " pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.853412 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d15db4e7-a13a-4bd9-8083-1ed09be64a82-etc-kubernetes\") pod \"multus-cqkxl\" (UID: \"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.853367 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d15db4e7-a13a-4bd9-8083-1ed09be64a82-host-run-k8s-cni-cncf-io\") pod \"multus-cqkxl\" (UID: \"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.853554 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d15db4e7-a13a-4bd9-8083-1ed09be64a82-hostroot\") pod \"multus-cqkxl\" (UID: \"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.853827 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d15db4e7-a13a-4bd9-8083-1ed09be64a82-host-run-multus-certs\") pod \"multus-cqkxl\" (UID: \"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.853907 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-host-kubelet\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.853973 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-host-kubelet\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.853913 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d15db4e7-a13a-4bd9-8083-1ed09be64a82-host-run-multus-certs\") pod 
\"multus-cqkxl\" (UID: \"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.853831 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d15db4e7-a13a-4bd9-8083-1ed09be64a82-hostroot\") pod \"multus-cqkxl\" (UID: \"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.853979 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.854012 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.853760 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/43e3c12f-3db0-4bb9-abb2-e78756ad93a5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tfzxw\" (UID: \"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\") " pod="openshift-multus/multus-additional-cni-plugins-tfzxw" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.854200 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/23efa997-378b-44cd-9f05-4a80559cd09b-ovnkube-script-lib\") pod 
\"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.854440 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/43e3c12f-3db0-4bb9-abb2-e78756ad93a5-cnibin\") pod \"multus-additional-cni-plugins-tfzxw\" (UID: \"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\") " pod="openshift-multus/multus-additional-cni-plugins-tfzxw" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.854492 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-run-openvswitch\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.854533 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d15db4e7-a13a-4bd9-8083-1ed09be64a82-multus-socket-dir-parent\") pod \"multus-cqkxl\" (UID: \"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.854568 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plbk2\" (UniqueName: \"kubernetes.io/projected/43e3c12f-3db0-4bb9-abb2-e78756ad93a5-kube-api-access-plbk2\") pod \"multus-additional-cni-plugins-tfzxw\" (UID: \"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\") " pod="openshift-multus/multus-additional-cni-plugins-tfzxw" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.854626 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-host-run-netns\") pod \"ovnkube-node-bmsd9\" 
(UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.854712 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d15db4e7-a13a-4bd9-8083-1ed09be64a82-system-cni-dir\") pod \"multus-cqkxl\" (UID: \"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.854769 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d15db4e7-a13a-4bd9-8083-1ed09be64a82-host-var-lib-cni-multus\") pod \"multus-cqkxl\" (UID: \"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.854832 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-systemd-units\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.854881 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d15db4e7-a13a-4bd9-8083-1ed09be64a82-system-cni-dir\") pod \"multus-cqkxl\" (UID: \"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.854981 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d15db4e7-a13a-4bd9-8083-1ed09be64a82-host-var-lib-cni-multus\") pod \"multus-cqkxl\" (UID: \"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl" Feb 21 00:08:07 crc kubenswrapper[4906]: 
I0221 00:08:07.854983 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-host-run-netns\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.854888 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/23efa997-378b-44cd-9f05-4a80559cd09b-ovn-node-metrics-cert\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.855086 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/43e3c12f-3db0-4bb9-abb2-e78756ad93a5-cnibin\") pod \"multus-additional-cni-plugins-tfzxw\" (UID: \"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\") " pod="openshift-multus/multus-additional-cni-plugins-tfzxw" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.855131 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-run-openvswitch\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.855106 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d15db4e7-a13a-4bd9-8083-1ed09be64a82-multus-daemon-config\") pod \"multus-cqkxl\" (UID: \"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.855169 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d15db4e7-a13a-4bd9-8083-1ed09be64a82-multus-socket-dir-parent\") pod \"multus-cqkxl\" (UID: \"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.855204 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sbct\" (UniqueName: \"kubernetes.io/projected/23efa997-378b-44cd-9f05-4a80559cd09b-kube-api-access-8sbct\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.855271 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-etc-openvswitch\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.855335 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-log-socket\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.855394 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-host-cni-bin\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.855443 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-log-socket\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.855297 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-systemd-units\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.855469 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5zp2\" (UniqueName: \"kubernetes.io/projected/f24b0c67-fe8d-4e72-916a-d82306a8b82e-kube-api-access-b5zp2\") pod \"node-resolver-hbhzd\" (UID: \"f24b0c67-fe8d-4e72-916a-d82306a8b82e\") " pod="openshift-dns/node-resolver-hbhzd" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.855514 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-var-lib-openvswitch\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.855544 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d15db4e7-a13a-4bd9-8083-1ed09be64a82-multus-cni-dir\") pod \"multus-cqkxl\" (UID: \"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.855578 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5zr5\" (UniqueName: \"kubernetes.io/projected/17518505-fa81-4399-b6cd-5527dae35ef3-kube-api-access-z5zr5\") pod 
\"machine-config-daemon-b9qdv\" (UID: \"17518505-fa81-4399-b6cd-5527dae35ef3\") " pod="openshift-machine-config-operator/machine-config-daemon-b9qdv"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.855612 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/43e3c12f-3db0-4bb9-abb2-e78756ad93a5-system-cni-dir\") pod \"multus-additional-cni-plugins-tfzxw\" (UID: \"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\") " pod="openshift-multus/multus-additional-cni-plugins-tfzxw"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.855645 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-node-log\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.855674 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/23efa997-378b-44cd-9f05-4a80559cd09b-ovnkube-config\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.855665 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/23efa997-378b-44cd-9f05-4a80559cd09b-ovnkube-script-lib\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.855769 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d15db4e7-a13a-4bd9-8083-1ed09be64a82-host-var-lib-kubelet\") pod \"multus-cqkxl\" (UID: \"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.855730 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d15db4e7-a13a-4bd9-8083-1ed09be64a82-host-var-lib-kubelet\") pod \"multus-cqkxl\" (UID: \"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.855831 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-run-ovn\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.855880 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d15db4e7-a13a-4bd9-8083-1ed09be64a82-os-release\") pod \"multus-cqkxl\" (UID: \"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.855908 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-host-cni-bin\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.855923 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/43e3c12f-3db0-4bb9-abb2-e78756ad93a5-os-release\") pod \"multus-additional-cni-plugins-tfzxw\" (UID: \"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\") " pod="openshift-multus/multus-additional-cni-plugins-tfzxw"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.856000 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/43e3c12f-3db0-4bb9-abb2-e78756ad93a5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tfzxw\" (UID: \"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\") " pod="openshift-multus/multus-additional-cni-plugins-tfzxw"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.856065 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-host-run-ovn-kubernetes\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.856121 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/23efa997-378b-44cd-9f05-4a80559cd09b-env-overrides\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.856167 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d15db4e7-a13a-4bd9-8083-1ed09be64a82-host-var-lib-cni-bin\") pod \"multus-cqkxl\" (UID: \"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.856229 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d15db4e7-a13a-4bd9-8083-1ed09be64a82-host-run-netns\") pod \"multus-cqkxl\" (UID: \"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.856311 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/43e3c12f-3db0-4bb9-abb2-e78756ad93a5-cni-binary-copy\") pod \"multus-additional-cni-plugins-tfzxw\" (UID: \"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\") " pod="openshift-multus/multus-additional-cni-plugins-tfzxw"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.856369 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-run-systemd\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.856406 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/43e3c12f-3db0-4bb9-abb2-e78756ad93a5-os-release\") pod \"multus-additional-cni-plugins-tfzxw\" (UID: \"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\") " pod="openshift-multus/multus-additional-cni-plugins-tfzxw"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.856422 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d15db4e7-a13a-4bd9-8083-1ed09be64a82-cni-binary-copy\") pod \"multus-cqkxl\" (UID: \"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.856525 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d15db4e7-a13a-4bd9-8083-1ed09be64a82-multus-cni-dir\") pod \"multus-cqkxl\" (UID: \"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.856545 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-node-log\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.856661 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d15db4e7-a13a-4bd9-8083-1ed09be64a82-host-var-lib-cni-bin\") pod \"multus-cqkxl\" (UID: \"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.856617 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/43e3c12f-3db0-4bb9-abb2-e78756ad93a5-system-cni-dir\") pod \"multus-additional-cni-plugins-tfzxw\" (UID: \"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\") " pod="openshift-multus/multus-additional-cni-plugins-tfzxw"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.856377 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-var-lib-openvswitch\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.857113 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-etc-openvswitch\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.857316 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/23efa997-378b-44cd-9f05-4a80559cd09b-ovnkube-config\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.857444 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-run-systemd\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.857478 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d15db4e7-a13a-4bd9-8083-1ed09be64a82-host-run-netns\") pod \"multus-cqkxl\" (UID: \"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.857508 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-run-ovn\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.857550 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-host-run-ovn-kubernetes\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.857608 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d15db4e7-a13a-4bd9-8083-1ed09be64a82-os-release\") pod \"multus-cqkxl\" (UID: \"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.857658 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/23efa997-378b-44cd-9f05-4a80559cd09b-env-overrides\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.857838 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/43e3c12f-3db0-4bb9-abb2-e78756ad93a5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tfzxw\" (UID: \"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\") " pod="openshift-multus/multus-additional-cni-plugins-tfzxw"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.858519 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/43e3c12f-3db0-4bb9-abb2-e78756ad93a5-cni-binary-copy\") pod \"multus-additional-cni-plugins-tfzxw\" (UID: \"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\") " pod="openshift-multus/multus-additional-cni-plugins-tfzxw"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.858537 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d15db4e7-a13a-4bd9-8083-1ed09be64a82-multus-daemon-config\") pod \"multus-cqkxl\" (UID: \"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.858964 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/17518505-fa81-4399-b6cd-5527dae35ef3-proxy-tls\") pod \"machine-config-daemon-b9qdv\" (UID: \"17518505-fa81-4399-b6cd-5527dae35ef3\") " pod="openshift-machine-config-operator/machine-config-daemon-b9qdv"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.859366 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cqkxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15db4e7-a13a-4bd9-8083-1ed09be64a82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cndgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cqkxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:07Z is after 2025-08-24T17:21:41Z"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.859503 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/23efa997-378b-44cd-9f05-4a80559cd09b-ovn-node-metrics-cert\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.860263 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d15db4e7-a13a-4bd9-8083-1ed09be64a82-cni-binary-copy\") pod \"multus-cqkxl\" (UID: \"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.884470 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sbct\" (UniqueName: \"kubernetes.io/projected/23efa997-378b-44cd-9f05-4a80559cd09b-kube-api-access-8sbct\") pod \"ovnkube-node-bmsd9\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.885840 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5zp2\" (UniqueName: \"kubernetes.io/projected/f24b0c67-fe8d-4e72-916a-d82306a8b82e-kube-api-access-b5zp2\") pod \"node-resolver-hbhzd\" (UID: \"f24b0c67-fe8d-4e72-916a-d82306a8b82e\") " pod="openshift-dns/node-resolver-hbhzd"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.887448 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cndgv\" (UniqueName: \"kubernetes.io/projected/d15db4e7-a13a-4bd9-8083-1ed09be64a82-kube-api-access-cndgv\") pod \"multus-cqkxl\" (UID: \"d15db4e7-a13a-4bd9-8083-1ed09be64a82\") " pod="openshift-multus/multus-cqkxl"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.888481 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5zr5\" (UniqueName: \"kubernetes.io/projected/17518505-fa81-4399-b6cd-5527dae35ef3-kube-api-access-z5zr5\") pod \"machine-config-daemon-b9qdv\" (UID: \"17518505-fa81-4399-b6cd-5527dae35ef3\") " pod="openshift-machine-config-operator/machine-config-daemon-b9qdv"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.892304 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d011e7-c2df-4b35-baf1-8b0404a8ae51\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686312fec8d89c23b2ef872e891fd0c1ba1279b0a64c834893c2d8431fca05bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bc1cd9f332e30d7fde3e05085076734d742d2bf375b3b5a55ba7a7e42e0a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbe8b3f30d9bff0951ca61694164e4ece35ca5d3562ad3571150b1cd13236c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255599132ce34bbd49e4298935a18d1294b2055069c606d796d58656a5552a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c798be0178552e289398fd4312d627282ba16f1cd39b2eb7d6e2ba6d4277323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:07Z is after 2025-08-24T17:21:41Z"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.894051 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plbk2\" (UniqueName: \"kubernetes.io/projected/43e3c12f-3db0-4bb9-abb2-e78756ad93a5-kube-api-access-plbk2\") pod \"multus-additional-cni-plugins-tfzxw\" (UID: \"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\") " pod="openshift-multus/multus-additional-cni-plugins-tfzxw"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.911877 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3359de6c-dfba-4630-b39d-68e056b5d2ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7928eee765ac7aaa7118868638603d627a00b59850dac177c991754fc324122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d4b213ed7600da4237ff1c423d004e3afdaff5f599b453398266f6cdae16ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3218b20e5cf2be1651a9fbaef483c8cb2bd297449319f559727af1a47661c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f5568f3c4167bda4510df4f94df728f3286e7c8137bb0ac9f6af2c30c4992a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:07Z is after 2025-08-24T17:21:41Z"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.912605 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-tfzxw"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.926821 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hbhzd"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.932304 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:07Z is after 2025-08-24T17:21:41Z"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.939457 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-cqkxl"
Feb 21 00:08:07 crc kubenswrapper[4906]: W0221 00:08:07.940894 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43e3c12f_3db0_4bb9_abb2_e78756ad93a5.slice/crio-648e36bbe54fdf5c7ac95c05f8f45c0b53953f5d9302b96f97a6f103f28cf1aa WatchSource:0}: Error finding container 648e36bbe54fdf5c7ac95c05f8f45c0b53953f5d9302b96f97a6f103f28cf1aa: Status 404 returned error can't find the container with id 648e36bbe54fdf5c7ac95c05f8f45c0b53953f5d9302b96f97a6f103f28cf1aa
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.956734 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv"
Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.956714 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:07Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:07 crc kubenswrapper[4906]: W0221 00:08:07.960200 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf24b0c67_fe8d_4e72_916a_d82306a8b82e.slice/crio-4737d5411babcfcfef476fefd87be5b2716baeb5a45f2cc7fff90aaeff1a0b1b WatchSource:0}: Error finding container 4737d5411babcfcfef476fefd87be5b2716baeb5a45f2cc7fff90aaeff1a0b1b: Status 404 returned error can't find the container with id 4737d5411babcfcfef476fefd87be5b2716baeb5a45f2cc7fff90aaeff1a0b1b Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.967016 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:07 crc kubenswrapper[4906]: I0221 00:08:07.983825 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:07Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:08 crc kubenswrapper[4906]: W0221 00:08:08.009086 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17518505_fa81_4399_b6cd_5527dae35ef3.slice/crio-460156ab2895a45978893e2f1db9317e4b0a98ace917ff86bbfac268690d75d1 WatchSource:0}: Error finding container 460156ab2895a45978893e2f1db9317e4b0a98ace917ff86bbfac268690d75d1: Status 404 returned error can't find the container with id 460156ab2895a45978893e2f1db9317e4b0a98ace917ff86bbfac268690d75d1 Feb 21 00:08:08 crc kubenswrapper[4906]: I0221 00:08:08.009784 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8cf02d3-2a07-464d-b75f-8d3ad8374553\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d527700428fdbde47902ae05e7592fb0f3b43af16fa72386eb4a1ff274156a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104130e8d59df60560691070c803ca780c06ea5439f8d04c160d53df2042c41f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 00:07:59.169352 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 00:07:59.172022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-365547171/tls.crt::/tmp/serving-cert-365547171/tls.key\\\\\\\"\\\\nI0221 00:08:05.242429 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 00:08:05.251263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 00:08:05.251291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 00:08:05.251317 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 00:08:05.251325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 00:08:05.269495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0221 00:08:05.269788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269943 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 00:08:05.269971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 00:08:05.269996 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 00:08:05.270022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0221 00:08:05.269815 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0221 00:08:05.276526 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:08Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:08 crc kubenswrapper[4906]: I0221 00:08:08.029015 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ceaf11f6f4645b302c0c6371902385cc6f8d6bdaa01cedf578765b7d72ca44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:08Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:08 crc kubenswrapper[4906]: I0221 00:08:08.041590 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17518505-fa81-4399-b6cd-5527dae35ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9qdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:08Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:08 crc kubenswrapper[4906]: I0221 00:08:08.464003 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 03:59:54.482882699 +0000 UTC Feb 21 00:08:08 crc kubenswrapper[4906]: I0221 00:08:08.735955 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hbhzd" event={"ID":"f24b0c67-fe8d-4e72-916a-d82306a8b82e","Type":"ContainerStarted","Data":"eec6020976a803c47d16f08a33b17b7b9654b2a9b8dd75a2c6f0a5cde4b2b963"} Feb 21 00:08:08 crc kubenswrapper[4906]: I0221 00:08:08.736065 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hbhzd" event={"ID":"f24b0c67-fe8d-4e72-916a-d82306a8b82e","Type":"ContainerStarted","Data":"4737d5411babcfcfef476fefd87be5b2716baeb5a45f2cc7fff90aaeff1a0b1b"} Feb 21 00:08:08 crc kubenswrapper[4906]: I0221 00:08:08.737352 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"90f1d03697352fb3306d487c9b25ee0f95f98488fe929392eac603103299efd6"} Feb 21 00:08:08 crc kubenswrapper[4906]: I0221 00:08:08.738869 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" event={"ID":"17518505-fa81-4399-b6cd-5527dae35ef3","Type":"ContainerStarted","Data":"1b5adc60c3a9b0129022f0c54757cf20624c76e6757c4595aa5ff3f80d69479f"} Feb 21 00:08:08 crc kubenswrapper[4906]: I0221 00:08:08.738924 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" 
event={"ID":"17518505-fa81-4399-b6cd-5527dae35ef3","Type":"ContainerStarted","Data":"c753f098aae83a1b91b668b00166c9de9e5fc03f7a39708263241e934d83fb81"} Feb 21 00:08:08 crc kubenswrapper[4906]: I0221 00:08:08.738942 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" event={"ID":"17518505-fa81-4399-b6cd-5527dae35ef3","Type":"ContainerStarted","Data":"460156ab2895a45978893e2f1db9317e4b0a98ace917ff86bbfac268690d75d1"} Feb 21 00:08:08 crc kubenswrapper[4906]: I0221 00:08:08.739934 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cqkxl" event={"ID":"d15db4e7-a13a-4bd9-8083-1ed09be64a82","Type":"ContainerStarted","Data":"1f4f38fff46919386afd729ea3f6437497e71eeeb557d7f2b955d0677b822a86"} Feb 21 00:08:08 crc kubenswrapper[4906]: I0221 00:08:08.739973 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cqkxl" event={"ID":"d15db4e7-a13a-4bd9-8083-1ed09be64a82","Type":"ContainerStarted","Data":"4d143d046a6523b82335c388cf8fd2b64d99f3374ab4808caa6ef4cd2e27a358"} Feb 21 00:08:08 crc kubenswrapper[4906]: I0221 00:08:08.742413 4906 generic.go:334] "Generic (PLEG): container finished" podID="43e3c12f-3db0-4bb9-abb2-e78756ad93a5" containerID="333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290" exitCode=0 Feb 21 00:08:08 crc kubenswrapper[4906]: I0221 00:08:08.742476 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tfzxw" event={"ID":"43e3c12f-3db0-4bb9-abb2-e78756ad93a5","Type":"ContainerDied","Data":"333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290"} Feb 21 00:08:08 crc kubenswrapper[4906]: I0221 00:08:08.742512 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tfzxw" 
event={"ID":"43e3c12f-3db0-4bb9-abb2-e78756ad93a5","Type":"ContainerStarted","Data":"648e36bbe54fdf5c7ac95c05f8f45c0b53953f5d9302b96f97a6f103f28cf1aa"} Feb 21 00:08:08 crc kubenswrapper[4906]: I0221 00:08:08.743557 4906 generic.go:334] "Generic (PLEG): container finished" podID="23efa997-378b-44cd-9f05-4a80559cd09b" containerID="352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668" exitCode=0 Feb 21 00:08:08 crc kubenswrapper[4906]: I0221 00:08:08.743607 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" event={"ID":"23efa997-378b-44cd-9f05-4a80559cd09b","Type":"ContainerDied","Data":"352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668"} Feb 21 00:08:08 crc kubenswrapper[4906]: I0221 00:08:08.743646 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" event={"ID":"23efa997-378b-44cd-9f05-4a80559cd09b","Type":"ContainerStarted","Data":"b9647615efe8406611f1044cef4c83e89d101c0e78f7f75cf1910d1a11711b91"} Feb 21 00:08:08 crc kubenswrapper[4906]: I0221 00:08:08.753464 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:08Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:08 crc kubenswrapper[4906]: I0221 00:08:08.774070 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfzxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfzxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:08Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:08 crc kubenswrapper[4906]: I0221 00:08:08.793170 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hbhzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24b0c67-fe8d-4e72-916a-d82306a8b82e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eec6020976a803c47d16f08a33b17b7b9654b2a9b8dd75a2c6f0a5cde4b2b963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac3
9aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5zp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hbhzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:08Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:08 crc kubenswrapper[4906]: I0221 00:08:08.816886 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23efa997-378b-44cd-9f05-4a80559cd09b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmsd9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:08Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:08 crc kubenswrapper[4906]: I0221 00:08:08.840744 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d011e7-c2df-4b35-baf1-8b0404a8ae51\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686312fec8d89c23b2ef872e891fd0c1ba1279b0a64c834893c2d8431fca05bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bc1cd9f332e30d7fde3e05085076734d742d2bf375b3b5a55ba7a7e42e0a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbe8b3f30d9bff0951ca61694164e4ece35ca5d3562ad3571150b1cd13236c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255599132ce34bbd49e4298935a18d129
4b2055069c606d796d58656a5552a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c798be0178552e289398fd4312d627282ba16f1cd39b2eb7d6e2ba6d4277323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:08Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:08 crc kubenswrapper[4906]: I0221 00:08:08.856515 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3359de6c-dfba-4630-b39d-68e056b5d2ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7928eee765ac7aaa7118868638603d627a00b59850dac177c991754fc324122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d4b213ed7600da4237ff1c423d004e3afdaff5f599b453398266f6cdae16ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3218b20e5cf2be1651a9fbaef483c8cb2bd297449319f559727af1a47661c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f5568f3c4167bda4510df4f94df728f3286e7c8137bb0ac9f6af2c30c4992a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:08Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:08 crc kubenswrapper[4906]: I0221 00:08:08.879794 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:08Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:08 crc kubenswrapper[4906]: I0221 00:08:08.893765 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:08Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:08 crc kubenswrapper[4906]: I0221 00:08:08.907333 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:08Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:08 crc kubenswrapper[4906]: I0221 00:08:08.929772 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cqkxl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15db4e7-a13a-4bd9-8083-1ed09be64a82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cndgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cqkxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:08Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:08 crc kubenswrapper[4906]: I0221 00:08:08.944916 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8cf02d3-2a07-464d-b75f-8d3ad8374553\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d527700428fdbde47902ae05e7592fb0f3b43af16fa72386eb4a1ff274156a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104130e8d59df60560691070c803ca780c06ea5439f8d04c160d53df2042c41f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 00:07:59.169352 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 00:07:59.172022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-365547171/tls.crt::/tmp/serving-cert-365547171/tls.key\\\\\\\"\\\\nI0221 00:08:05.242429 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 00:08:05.251263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 00:08:05.251291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 00:08:05.251317 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 00:08:05.251325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 00:08:05.269495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0221 00:08:05.269788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269943 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 00:08:05.269971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 00:08:05.269996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 00:08:05.270022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0221 00:08:05.269815 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0221 00:08:05.276526 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:08Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:08 crc kubenswrapper[4906]: I0221 00:08:08.958299 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ceaf11f6f4645b302c0c6371902385cc6f8d6bdaa01cedf578765b7d72ca44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:08Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:08 crc kubenswrapper[4906]: I0221 00:08:08.974454 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17518505-fa81-4399-b6cd-5527dae35ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9qdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:08Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:08 crc kubenswrapper[4906]: I0221 00:08:08.988600 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e655a3c2f3a6223fe12dded10c0c7b2e8b5024914ead29887a06efd4141b670e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c0b66a5fe8b81df7b0edfcef98acab5208abfce4cedc368707c6ce804b9b99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:08Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:09 crc kubenswrapper[4906]: I0221 00:08:09.006922 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d011e7-c2df-4b35-baf1-8b0404a8ae51\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686312fec8d89c23b2ef872e891fd0c1ba1279b0a64c834893c2d8431fca05bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bc1cd9f332e30d7fde3e05085076734d742d2bf375b3b5a55ba7a7e42e0a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbe8b3f30d9bff0951ca61694164e4ece35ca5d3562ad3571150b1cd13236c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255599132ce34bbd49e4298935a18d1294b2055069c606d796d58656a5552a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c798be0178552e289398fd4312d627282ba16f1cd39b2eb7d6e2ba6d4277323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:09Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:09 crc kubenswrapper[4906]: I0221 00:08:09.018520 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3359de6c-dfba-4630-b39d-68e056b5d2ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7928eee765ac7aaa7118868638603d627a00b59850dac177c991754fc324122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d4b213ed7600da4237ff1c423d004e3afdaff5f599b453398266f6cdae16ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3218b20e5cf2be1651a9fbaef483c8cb2bd297449319f559727af1a47661c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:
47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f5568f3c4167bda4510df4f94df728f3286e7c8137bb0ac9f6af2c30c4992a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:09Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:09 crc kubenswrapper[4906]: I0221 00:08:09.029638 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:09Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:09 crc kubenswrapper[4906]: I0221 00:08:09.041641 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:09Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:09 crc kubenswrapper[4906]: I0221 00:08:09.055084 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f1d03697352fb3306d487c9b25ee0f95f98488fe929392eac603103299efd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T00:08:09Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:09 crc kubenswrapper[4906]: I0221 00:08:09.070151 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cqkxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15db4e7-a13a-4bd9-8083-1ed09be64a82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4f38fff46919386afd729ea3f6437497e71eeeb557d7f2b955d0677b822a86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cndgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cqkxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T00:08:09Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:09 crc kubenswrapper[4906]: I0221 00:08:09.088591 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ceaf11f6f4645b302c0c6371902385cc6f8d6bdaa01cedf578765b7d72ca44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:09Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:09 crc kubenswrapper[4906]: I0221 00:08:09.098451 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17518505-fa81-4399-b6cd-5527dae35ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b5adc60c3a9b0129022f0c54757cf20624c76e6757c4595aa5ff3f80d69479f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c753f098aae83a1b91b668b00166c9de9e5fc03f7a39708263241e934d83fb81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9qdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2026-02-21T00:08:09Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:09 crc kubenswrapper[4906]: I0221 00:08:09.119106 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8cf02d3-2a07-464d-b75f-8d3ad8374553\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d527700428fdbde47902ae05e7592fb0f3b43af16fa72386eb4a1ff274156a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104130e8d59df60560691070c803ca780c06ea5439f8d04c160d53df2042c41f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 00:07:59.169352 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 00:07:59.172022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-365547171/tls.crt::/tmp/serving-cert-365547171/tls.key\\\\\\\"\\\\nI0221 00:08:05.242429 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 00:08:05.251263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 00:08:05.251291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 00:08:05.251317 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 00:08:05.251325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 00:08:05.269495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0221 00:08:05.269788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269943 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 00:08:05.269971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 00:08:05.269996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 00:08:05.270022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0221 00:08:05.269815 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0221 00:08:05.276526 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:09Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:09 crc kubenswrapper[4906]: I0221 00:08:09.134742 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e655a3c2f3a6223fe12dded10c0c7b2e8b5024914ead29887a06efd4141b670e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c0b66a5fe8b81df7b0edfcef98acab5208abfce4cedc368707c6ce804b9b99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:09Z is after 2025-08-24T17:21:41Z" Feb 21 
00:08:09 crc kubenswrapper[4906]: I0221 00:08:09.148155 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfzxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfzxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:09Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:09 crc kubenswrapper[4906]: I0221 00:08:09.160373 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hbhzd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24b0c67-fe8d-4e72-916a-d82306a8b82e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eec6020976a803c47d16f08a33b17b7b9654b2a9b8dd75a2c6f0a5cde4b2b963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5zp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hbhzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:09Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:09 crc kubenswrapper[4906]: I0221 00:08:09.173136 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:08:09 crc kubenswrapper[4906]: E0221 00:08:09.173313 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:08:13.173294831 +0000 UTC m=+28.424882457 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:08:09 crc kubenswrapper[4906]: I0221 00:08:09.179843 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23efa997-378b-44cd-9f05-4a80559cd09b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmsd9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:09Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:09 crc kubenswrapper[4906]: I0221 00:08:09.196397 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:09Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:09 crc kubenswrapper[4906]: I0221 00:08:09.274831 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:08:09 crc kubenswrapper[4906]: I0221 00:08:09.274903 4906 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:08:09 crc kubenswrapper[4906]: I0221 00:08:09.274926 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:08:09 crc kubenswrapper[4906]: I0221 00:08:09.274949 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:08:09 crc kubenswrapper[4906]: E0221 00:08:09.275107 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 21 00:08:09 crc kubenswrapper[4906]: E0221 00:08:09.275125 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 21 00:08:09 crc kubenswrapper[4906]: E0221 00:08:09.275142 4906 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 00:08:09 crc kubenswrapper[4906]: E0221 00:08:09.275204 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-21 00:08:13.275184947 +0000 UTC m=+28.526772453 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 00:08:09 crc kubenswrapper[4906]: E0221 00:08:09.275262 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 21 00:08:09 crc kubenswrapper[4906]: E0221 00:08:09.275272 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 21 00:08:09 crc kubenswrapper[4906]: E0221 00:08:09.275281 4906 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 00:08:09 crc kubenswrapper[4906]: E0221 00:08:09.275303 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-02-21 00:08:13.27529601 +0000 UTC m=+28.526883516 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 00:08:09 crc kubenswrapper[4906]: E0221 00:08:09.275353 4906 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 21 00:08:09 crc kubenswrapper[4906]: E0221 00:08:09.275377 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-21 00:08:13.275369332 +0000 UTC m=+28.526956838 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 21 00:08:09 crc kubenswrapper[4906]: E0221 00:08:09.275408 4906 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 21 00:08:09 crc kubenswrapper[4906]: E0221 00:08:09.275429 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-21 00:08:13.275423663 +0000 UTC m=+28.527011169 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 21 00:08:09 crc kubenswrapper[4906]: I0221 00:08:09.465063 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 18:36:00.429949539 +0000 UTC Feb 21 00:08:09 crc kubenswrapper[4906]: I0221 00:08:09.517003 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:08:09 crc kubenswrapper[4906]: I0221 00:08:09.517034 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:08:09 crc kubenswrapper[4906]: I0221 00:08:09.517005 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:08:09 crc kubenswrapper[4906]: E0221 00:08:09.517127 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:08:09 crc kubenswrapper[4906]: E0221 00:08:09.517220 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:08:09 crc kubenswrapper[4906]: E0221 00:08:09.517269 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:08:09 crc kubenswrapper[4906]: I0221 00:08:09.751465 4906 generic.go:334] "Generic (PLEG): container finished" podID="43e3c12f-3db0-4bb9-abb2-e78756ad93a5" containerID="2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578" exitCode=0 Feb 21 00:08:09 crc kubenswrapper[4906]: I0221 00:08:09.751628 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tfzxw" event={"ID":"43e3c12f-3db0-4bb9-abb2-e78756ad93a5","Type":"ContainerDied","Data":"2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578"} Feb 21 00:08:09 crc kubenswrapper[4906]: I0221 00:08:09.759888 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" event={"ID":"23efa997-378b-44cd-9f05-4a80559cd09b","Type":"ContainerStarted","Data":"2cd0d9e0d6ac22632f0ca07c3ad926d99b71e9d1757e8ef0ee87a2008afbc671"} Feb 21 00:08:09 crc kubenswrapper[4906]: I0221 00:08:09.759966 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" event={"ID":"23efa997-378b-44cd-9f05-4a80559cd09b","Type":"ContainerStarted","Data":"b1574681de69105a2b0693609c82ef8c0cd554037fe2f4a2fd7345650bf4a006"} Feb 21 00:08:09 crc kubenswrapper[4906]: I0221 00:08:09.760002 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" event={"ID":"23efa997-378b-44cd-9f05-4a80559cd09b","Type":"ContainerStarted","Data":"2aae4c734bcf6dc35c8437a49887777a17b228f7196bf2d77a8179923f121351"} Feb 21 00:08:09 crc kubenswrapper[4906]: I0221 00:08:09.760028 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" event={"ID":"23efa997-378b-44cd-9f05-4a80559cd09b","Type":"ContainerStarted","Data":"e86b35bdd91ab61ef7469c956499be0dd990275293748f4fb8f43f2f42d84fc1"} Feb 21 00:08:09 crc 
kubenswrapper[4906]: I0221 00:08:09.760054 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" event={"ID":"23efa997-378b-44cd-9f05-4a80559cd09b","Type":"ContainerStarted","Data":"f8f4202ebd523d2b63c5625f20cb60477ce419a5b34d926dccab2661b2c107aa"} Feb 21 00:08:09 crc kubenswrapper[4906]: I0221 00:08:09.781008 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ceaf11f6f4645b302c0c6371902385cc6f8d6bdaa01cedf578765b7d72ca44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ser
ving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:09Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:09 crc kubenswrapper[4906]: I0221 00:08:09.801886 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17518505-fa81-4399-b6cd-5527dae35ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b5adc60c3a9b0129022f0c54757cf20624c76e6757c4595aa5ff3f80d69479f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d664
38c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c753f098aae83a1b91b668b00166c9de9e5fc03f7a39708263241e934d83fb81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9qdv\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:09Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:09 crc kubenswrapper[4906]: I0221 00:08:09.843375 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8cf02d3-2a07-464d-b75f-8d3ad8374553\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d527700428fdbde47902ae05e7592fb0f3b43af16fa72386eb4a1ff274156a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104130e8d59df60560691070c803ca780c06ea5439f8d04c160d53df2042c41f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 00:07:59.169352 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 00:07:59.172022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-365547171/tls.crt::/tmp/serving-cert-365547171/tls.key\\\\\\\"\\\\nI0221 00:08:05.242429 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 00:08:05.251263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 00:08:05.251291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 00:08:05.251317 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 00:08:05.251325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 00:08:05.269495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0221 00:08:05.269788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269943 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 00:08:05.269971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 00:08:05.269996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 00:08:05.270022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0221 00:08:05.269815 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0221 00:08:05.276526 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:09Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:09 crc kubenswrapper[4906]: I0221 00:08:09.866085 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e655a3c2f3a6223fe12dded10c0c7b2e8b5024914ead29887a06efd4141b670e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c0b66a5fe8b81df7b0edfcef98acab5208abfce4cedc368707c6ce804b9b99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:09Z is after 2025-08-24T17:21:41Z" Feb 21 
00:08:09 crc kubenswrapper[4906]: I0221 00:08:09.894099 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfzxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfzxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:09Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:09 crc kubenswrapper[4906]: I0221 00:08:09.908804 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hbhzd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24b0c67-fe8d-4e72-916a-d82306a8b82e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eec6020976a803c47d16f08a33b17b7b9654b2a9b8dd75a2c6f0a5cde4b2b963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5zp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hbhzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:09Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:09 crc kubenswrapper[4906]: I0221 00:08:09.937652 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23efa997-378b-44cd-9f05-4a80559cd09b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmsd9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:09Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:09 crc kubenswrapper[4906]: I0221 00:08:09.956876 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:09Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:09 crc kubenswrapper[4906]: I0221 00:08:09.981949 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d011e7-c2df-4b35-baf1-8b0404a8ae51\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686312fec8d89c23b2ef872e891fd0c1ba1279b0a64c834893c2d8431fca05bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bc1cd9f332e30d7fde3e05085076734d742d2bf375b3b5a55ba7a7e42e0a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbe8b3f30d9bff0951ca61694164e4ece35ca5d3562ad3571150b1cd13236c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255599132ce34bbd49e4298935a18d1294b2055069c606d796d58656a5552a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c798be0178552e289398fd4312d627282ba16f1cd39b2eb7d6e2ba6d4277323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:09Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:09 crc kubenswrapper[4906]: I0221 00:08:09.995963 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3359de6c-dfba-4630-b39d-68e056b5d2ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7928eee765ac7aaa7118868638603d627a00b59850dac177c991754fc324122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d4b213ed7600da4237ff1c423d004e3afdaff5f599b453398266f6cdae16ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3218b20e5cf2be1651a9fbaef483c8cb2bd297449319f559727af1a47661c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:
47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f5568f3c4167bda4510df4f94df728f3286e7c8137bb0ac9f6af2c30c4992a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:09Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:10 crc kubenswrapper[4906]: I0221 00:08:10.009857 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:10Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:10 crc kubenswrapper[4906]: I0221 00:08:10.024510 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:10Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:10 crc kubenswrapper[4906]: I0221 00:08:10.038874 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f1d03697352fb3306d487c9b25ee0f95f98488fe929392eac603103299efd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T00:08:10Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:10 crc kubenswrapper[4906]: I0221 00:08:10.053298 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cqkxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15db4e7-a13a-4bd9-8083-1ed09be64a82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4f38fff46919386afd729ea3f6437497e71eeeb557d7f2b955d0677b822a86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cndgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cqkxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T00:08:10Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:10 crc kubenswrapper[4906]: I0221 00:08:10.466222 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 02:57:18.287554455 +0000 UTC Feb 21 00:08:10 crc kubenswrapper[4906]: I0221 00:08:10.776078 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" event={"ID":"23efa997-378b-44cd-9f05-4a80559cd09b","Type":"ContainerStarted","Data":"ead16930cc675e04c24dd469e0e7f7ada74a7ef3fe87e095e6702204361d89bf"} Feb 21 00:08:10 crc kubenswrapper[4906]: I0221 00:08:10.780044 4906 generic.go:334] "Generic (PLEG): container finished" podID="43e3c12f-3db0-4bb9-abb2-e78756ad93a5" containerID="902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df" exitCode=0 Feb 21 00:08:10 crc kubenswrapper[4906]: I0221 00:08:10.780102 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tfzxw" event={"ID":"43e3c12f-3db0-4bb9-abb2-e78756ad93a5","Type":"ContainerDied","Data":"902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df"} Feb 21 00:08:10 crc kubenswrapper[4906]: I0221 00:08:10.806234 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8cf02d3-2a07-464d-b75f-8d3ad8374553\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d527700428fdbde47902ae05e7592fb0f3b43af16fa72386eb4a1ff274156a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104130e8d59df60560691070c803ca780c06ea5439f8d04c160d53df2042c41f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 00:07:59.169352 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 00:07:59.172022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-365547171/tls.crt::/tmp/serving-cert-365547171/tls.key\\\\\\\"\\\\nI0221 00:08:05.242429 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 00:08:05.251263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 00:08:05.251291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 00:08:05.251317 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 00:08:05.251325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 00:08:05.269495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0221 00:08:05.269788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269943 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 00:08:05.269971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 00:08:05.269996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 00:08:05.270022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0221 00:08:05.269815 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0221 00:08:05.276526 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:10Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:10 crc kubenswrapper[4906]: I0221 00:08:10.828722 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ceaf11f6f4645b302c0c6371902385cc6f8d6bdaa01cedf578765b7d72ca44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:10Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:10 crc kubenswrapper[4906]: I0221 00:08:10.843774 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17518505-fa81-4399-b6cd-5527dae35ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b5adc60c3a9b0129022f0c54757cf20624c76e6757c4595aa5ff3f80d69479f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c753f098aae83a1b91b668b00166c9de9e5fc03f
7a39708263241e934d83fb81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9qdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:10Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:10 crc kubenswrapper[4906]: I0221 00:08:10.858492 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e655a3c2f3a6223fe12dded10c0c7b2e8b5024914ead29887a06efd4141b670e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c0b66a5fe8b81df7b0edfcef98acab5208abfce4cedc368707c6ce804b9b99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:10Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:10 crc kubenswrapper[4906]: I0221 00:08:10.874489 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:10Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:10 crc kubenswrapper[4906]: I0221 00:08:10.890073 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfzxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfzxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:10Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:10 crc kubenswrapper[4906]: I0221 
00:08:10.904051 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hbhzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24b0c67-fe8d-4e72-916a-d82306a8b82e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eec6020976a803c47d16f08a33b17b7b9654b2a9b8dd75a2c6f0a5cde4b2b963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5zp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"19
2.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hbhzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:10Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:10 crc kubenswrapper[4906]: I0221 00:08:10.936850 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23efa997-378b-44cd-9f05-4a80559cd09b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa
41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\
\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,
\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmsd9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:10Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:10 crc kubenswrapper[4906]: I0221 00:08:10.956898 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cqkxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15db4e7-a13a-4bd9-8083-1ed09be64a82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4f38fff46919386afd729ea3f6437497e71eeeb557d7f2b955
d0677b822a86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-cndgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cqkxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:10Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:10 crc kubenswrapper[4906]: I0221 00:08:10.991727 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d011e7-c2df-4b35-baf1-8b0404a8ae51\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686312fec8d89c23b2ef872e891fd0c1ba1279b0a64c834893c2d8431fca05bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b
7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bc1cd9f332e30d7fde3e05085076734d742d2bf375b3b5a55ba7a7e42e0a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbe8b3f30d9bff0951ca61694164e4ece35ca5d3562ad3571150b1cd13236c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e
779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255599132ce34bbd49e4298935a18d1294b2055069c606d796d58656a5552a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c798be0178552e289398fd4312d627282ba16f1cd39b2eb7d6e2ba6d4277323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\
\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63
180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:10Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.007703 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3359de6c-dfba-4630-b39d-68e056b5d2ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7928eee765ac7aaa7118868638603d627a00b59850dac177c991754fc324122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d4b213ed7600da4237ff1c423d004e3afdaff5f599b453398266f6cdae16ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3218b20e5cf2be1651a9fbaef483c8cb2bd297449319f559727af1a47661c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f5568f3c4167bda4510df4f94df728f3286e7c8137bb0ac9f6af2c30c4992a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.034435 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.050189 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.065043 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f1d03697352fb3306d487c9b25ee0f95f98488fe929392eac603103299efd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.468093 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 11:42:17.661486007 +0000 UTC Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.516827 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.516836 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:08:11 crc kubenswrapper[4906]: E0221 00:08:11.517422 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.516888 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:08:11 crc kubenswrapper[4906]: E0221 00:08:11.517612 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:08:11 crc kubenswrapper[4906]: E0221 00:08:11.517679 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.620272 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.622878 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.622946 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.622965 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.623115 4906 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.632486 4906 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.632913 4906 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.634572 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.634631 4906 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.634650 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.634736 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.634755 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:11Z","lastTransitionTime":"2026-02-21T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:11 crc kubenswrapper[4906]: E0221 00:08:11.651267 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4d8b0bdc-2182-48d0-bb15-cc57765305f9\\\",\\\"systemUUID\\\":\\\"94310220-1d46-4502-bb0a-b3628ff11479\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.655760 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.655814 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.655824 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.655840 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.655852 4906 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:11Z","lastTransitionTime":"2026-02-21T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:11 crc kubenswrapper[4906]: E0221 00:08:11.671020 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4d8b0bdc-2182-48d0-bb15-cc57765305f9\\\",\\\"systemUUID\\\":\\\"94310220-1d46-4502-bb0a-b3628ff11479\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.675068 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.675101 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.675113 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.675128 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.675139 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:11Z","lastTransitionTime":"2026-02-21T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:11 crc kubenswrapper[4906]: E0221 00:08:11.690186 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4d8b0bdc-2182-48d0-bb15-cc57765305f9\\\",\\\"systemUUID\\\":\\\"94310220-1d46-4502-bb0a-b3628ff11479\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.694425 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.694461 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.694472 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.694488 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.694498 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:11Z","lastTransitionTime":"2026-02-21T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:11 crc kubenswrapper[4906]: E0221 00:08:11.715123 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4d8b0bdc-2182-48d0-bb15-cc57765305f9\\\",\\\"systemUUID\\\":\\\"94310220-1d46-4502-bb0a-b3628ff11479\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.720743 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.720805 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.720826 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.720851 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.720872 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:11Z","lastTransitionTime":"2026-02-21T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:11 crc kubenswrapper[4906]: E0221 00:08:11.739163 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4d8b0bdc-2182-48d0-bb15-cc57765305f9\\\",\\\"systemUUID\\\":\\\"94310220-1d46-4502-bb0a-b3628ff11479\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:11 crc kubenswrapper[4906]: E0221 00:08:11.739366 4906 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.741141 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.741174 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.741185 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.741203 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.741215 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:11Z","lastTransitionTime":"2026-02-21T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.785324 4906 generic.go:334] "Generic (PLEG): container finished" podID="43e3c12f-3db0-4bb9-abb2-e78756ad93a5" containerID="1957f1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2" exitCode=0 Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.785373 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tfzxw" event={"ID":"43e3c12f-3db0-4bb9-abb2-e78756ad93a5","Type":"ContainerDied","Data":"1957f1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2"} Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.801412 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e655a3c2f3a6223fe12dded10c0c7b2e8b5024914ead29887a06efd4141b670e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\
\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c0b66a5fe8b81df7b0edfcef98acab5208abfce4cedc368707c6ce804b9b99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.815231 4906 
status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hbhzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24b0c67-fe8d-4e72-916a-d82306a8b82e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eec6020976a803c47d16f08a33b17b7b9654b2a9b8dd75a2c6f0a5cde4b2b963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5zp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"
hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hbhzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.837142 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23efa997-378b-44cd-9f05-4a80559cd09b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wai
ting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":
0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\
\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmsd9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.844289 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.844348 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.844367 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.844388 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.844407 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:11Z","lastTransitionTime":"2026-02-21T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.854949 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.870309 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfzxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1957f1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1957f1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfzxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.882104 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3359de6c-dfba-4630-b39d-68e056b5d2ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7928eee765ac7aaa7118868638603d627a00b59850dac177c991754fc324122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d4b213ed7600da4237ff1c423d004e3afdaff5f599b453398266f6cdae16ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3218b20e5cf2be1651a9fbaef483c8cb2bd297449319f559727af1a47661c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f5568f3c4167bda4510df4f94df728f3286e7c8137bb0ac9f6af2c30c4992a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.892466 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.903729 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.917481 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f1d03697352fb3306d487c9b25ee0f95f98488fe929392eac603103299efd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.930389 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cqkxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15db4e7-a13a-4bd9-8083-1ed09be64a82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4f38fff46919386afd729ea3f6437497e71eeeb557d7f2b955d0677b822a86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cndgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cqkxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.947857 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.948183 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.948329 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.948465 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.948595 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:11Z","lastTransitionTime":"2026-02-21T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.951792 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d011e7-c2df-4b35-baf1-8b0404a8ae51\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686312fec8d89c23b2ef872e891fd0c1ba1279b0a64c834893c2d8431fca05bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bc1cd9f332e30d7fde3e05085076734d742d2bf375b3b5a55ba7a7e42e0a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbe8b3f30d9bff0951ca61694164e4ece35ca5d3562ad3571150b1cd13236c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255599132ce34bbd49e4298935a18d1294b2055069c606d796d58656a5552a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c798be0178552e289398fd4312d627282ba16f1cd39b2eb7d6e2ba6d4277323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.963948 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17518505-fa81-4399-b6cd-5527dae35ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b5adc60c3a9b0129022f0c54757cf20624c76e6757c4595aa5ff3f80d69479f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c753f098aae83a1b91b668b00166c9de9e5fc03f
7a39708263241e934d83fb81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9qdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.976493 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8cf02d3-2a07-464d-b75f-8d3ad8374553\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d527700428fdbde47902ae05e7592fb0f3b43af16fa72386eb4a1ff274156a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104130e8d59df60560691070c803ca780c06ea5439f8d04c160d53df2042c41f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 00:07:59.169352 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 00:07:59.172022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-365547171/tls.crt::/tmp/serving-cert-365547171/tls.key\\\\\\\"\\\\nI0221 00:08:05.242429 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 00:08:05.251263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 00:08:05.251291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 00:08:05.251317 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 00:08:05.251325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 00:08:05.269495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0221 00:08:05.269788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269943 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 00:08:05.269971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 00:08:05.269996 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 00:08:05.270022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0221 00:08:05.269815 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0221 00:08:05.276526 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:11 crc kubenswrapper[4906]: I0221 00:08:11.989958 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ceaf11f6f4645b302c0c6371902385cc6f8d6bdaa01cedf578765b7d72ca44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.051908 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.051948 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.051956 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.051970 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.051979 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:12Z","lastTransitionTime":"2026-02-21T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.154098 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.154138 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.154148 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.154162 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.154171 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:12Z","lastTransitionTime":"2026-02-21T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.256237 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.256269 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.256277 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.256289 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.256299 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:12Z","lastTransitionTime":"2026-02-21T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.359450 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.359490 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.359498 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.359511 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.359520 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:12Z","lastTransitionTime":"2026-02-21T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.462537 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.462611 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.462631 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.462655 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.462672 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:12Z","lastTransitionTime":"2026-02-21T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.468961 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 23:34:44.245088339 +0000 UTC Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.565542 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.565583 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.565594 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.565619 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.565633 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:12Z","lastTransitionTime":"2026-02-21T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.668552 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.668593 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.668605 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.668625 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.668639 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:12Z","lastTransitionTime":"2026-02-21T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.771911 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.771974 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.771992 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.772019 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.772037 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:12Z","lastTransitionTime":"2026-02-21T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.793175 4906 generic.go:334] "Generic (PLEG): container finished" podID="43e3c12f-3db0-4bb9-abb2-e78756ad93a5" containerID="ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be" exitCode=0 Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.793228 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tfzxw" event={"ID":"43e3c12f-3db0-4bb9-abb2-e78756ad93a5","Type":"ContainerDied","Data":"ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be"} Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.799387 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" event={"ID":"23efa997-378b-44cd-9f05-4a80559cd09b","Type":"ContainerStarted","Data":"1ef79e3c346342fa9d098b503a7dff636ce88cb88465cf6e802b9581980e46d7"} Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.827501 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e655a3c2f3a6223fe12dded10c0c7b2e8b5024914ead29887a06efd4141b670e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c0b66a5fe8b81df7b0edfcef98acab5208abfce4cedc368707c6ce804b9b99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:12Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.849215 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:12Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.872807 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfzxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1957f1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1957f1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfzxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:12Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.874241 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.874270 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.874283 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.874302 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.874313 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:12Z","lastTransitionTime":"2026-02-21T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.889066 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hbhzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24b0c67-fe8d-4e72-916a-d82306a8b82e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eec6020976a803c47d16f08a33b17b7b9654b2a9b8dd75a2c6f0a5cde4b2b963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5zp2\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hbhzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:12Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.910219 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23efa997-378b-44cd-9f05-4a80559cd09b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmsd9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:12Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.939732 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d011e7-c2df-4b35-baf1-8b0404a8ae51\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686312fec8d89c23b2ef872e891fd0c1ba1279b0a64c834893c2d8431fca05bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bc1cd9f332e30d7fde3e05085076734d742d2bf375b3b5a55ba7a7e42e0a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbe8b3f30d9bff0951ca61694164e4ece35ca5d3562ad3571150b1cd13236c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255599132ce34bbd49e4298935a18d1294b2055069c606d796d58656a5552a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c798be0178552e289398fd4312d627282ba16f1cd39b2eb7d6e2ba6d4277323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:12Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.955051 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3359de6c-dfba-4630-b39d-68e056b5d2ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7928eee765ac7aaa7118868638603d627a00b59850dac177c991754fc324122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d4b213ed7600da4237ff1c423d004e3afdaff5f599b453398266f6cdae16ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3218b20e5cf2be1651a9fbaef483c8cb2bd297449319f559727af1a47661c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:
47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f5568f3c4167bda4510df4f94df728f3286e7c8137bb0ac9f6af2c30c4992a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:12Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.972427 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:12Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.976858 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.976890 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.976901 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.976917 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.976929 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:12Z","lastTransitionTime":"2026-02-21T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:12 crc kubenswrapper[4906]: I0221 00:08:12.987109 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:12Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.001094 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f1d03697352fb3306d487c9b25ee0f95f98488fe929392eac603103299efd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T00:08:12Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.016081 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cqkxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15db4e7-a13a-4bd9-8083-1ed09be64a82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4f38fff46919386afd729ea3f6437497e71eeeb557d7f2b955d0677b822a86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cndgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cqkxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.033699 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8cf02d3-2a07-464d-b75f-8d3ad8374553\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d527700428fdbde47902ae05e7592fb0f3b43af16fa72386eb4a1ff274156a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104130e8d59df60560691070c803ca780c06ea5439f8d04c160d53df2042c41f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 00:07:59.169352 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 00:07:59.172022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-365547171/tls.crt::/tmp/serving-cert-365547171/tls.key\\\\\\\"\\\\nI0221 00:08:05.242429 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 00:08:05.251263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 00:08:05.251291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 00:08:05.251317 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 00:08:05.251325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 00:08:05.269495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0221 00:08:05.269788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269943 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 00:08:05.269971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 00:08:05.269996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 00:08:05.270022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0221 00:08:05.269815 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0221 00:08:05.276526 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.046818 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ceaf11f6f4645b302c0c6371902385cc6f8d6bdaa01cedf578765b7d72ca44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.059543 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17518505-fa81-4399-b6cd-5527dae35ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b5adc60c3a9b0129022f0c54757cf20624c76e6757c4595aa5ff3f80d69479f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c753f098aae83a1b91b668b00166c9de9e5fc03f
7a39708263241e934d83fb81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9qdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.079488 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.079538 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.079554 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:13 crc 
kubenswrapper[4906]: I0221 00:08:13.079577 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.079593 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:13Z","lastTransitionTime":"2026-02-21T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.182197 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.182266 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.182283 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.182308 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.182326 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:13Z","lastTransitionTime":"2026-02-21T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.195825 4906 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.262188 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:08:13 crc kubenswrapper[4906]: E0221 00:08:13.262402 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:08:21.262365194 +0000 UTC m=+36.513952730 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.284994 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.285055 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.285071 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:13 crc kubenswrapper[4906]: 
I0221 00:08:13.285091 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.285104 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:13Z","lastTransitionTime":"2026-02-21T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.363727 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.363821 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.363877 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:08:13 crc kubenswrapper[4906]: E0221 00:08:13.363893 4906 
secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.363918 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:08:13 crc kubenswrapper[4906]: E0221 00:08:13.364010 4906 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 21 00:08:13 crc kubenswrapper[4906]: E0221 00:08:13.364088 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 21 00:08:13 crc kubenswrapper[4906]: E0221 00:08:13.364102 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 21 00:08:13 crc kubenswrapper[4906]: E0221 00:08:13.364112 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-21 00:08:21.364025914 +0000 UTC m=+36.615613460 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 21 00:08:13 crc kubenswrapper[4906]: E0221 00:08:13.364124 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 21 00:08:13 crc kubenswrapper[4906]: E0221 00:08:13.364136 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 21 00:08:13 crc kubenswrapper[4906]: E0221 00:08:13.364155 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-21 00:08:21.364140967 +0000 UTC m=+36.615728513 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 21 00:08:13 crc kubenswrapper[4906]: E0221 00:08:13.364156 4906 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 00:08:13 crc kubenswrapper[4906]: E0221 00:08:13.364232 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-21 00:08:21.364213379 +0000 UTC m=+36.615800915 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 00:08:13 crc kubenswrapper[4906]: E0221 00:08:13.364163 4906 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 00:08:13 crc kubenswrapper[4906]: E0221 00:08:13.364308 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-21 00:08:21.364295672 +0000 UTC m=+36.615883208 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.388592 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.388663 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.388722 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.388756 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.388780 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:13Z","lastTransitionTime":"2026-02-21T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.470614 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 12:36:48.672692204 +0000 UTC Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.492728 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.492804 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.492826 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.492856 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.492879 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:13Z","lastTransitionTime":"2026-02-21T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.516894 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.516922 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:08:13 crc kubenswrapper[4906]: E0221 00:08:13.517075 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.517187 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:08:13 crc kubenswrapper[4906]: E0221 00:08:13.517385 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:08:13 crc kubenswrapper[4906]: E0221 00:08:13.517560 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.595799 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.595833 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.595843 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.595860 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.595871 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:13Z","lastTransitionTime":"2026-02-21T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.698899 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.698950 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.698962 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.698980 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.698992 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:13Z","lastTransitionTime":"2026-02-21T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.802736 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.802786 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.802796 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.802813 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.802828 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:13Z","lastTransitionTime":"2026-02-21T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.807621 4906 generic.go:334] "Generic (PLEG): container finished" podID="43e3c12f-3db0-4bb9-abb2-e78756ad93a5" containerID="c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f" exitCode=0 Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.807667 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tfzxw" event={"ID":"43e3c12f-3db0-4bb9-abb2-e78756ad93a5","Type":"ContainerDied","Data":"c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f"} Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.833725 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17518505-fa81-4399-b6cd-5527dae35ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b5adc60c3a9b0129022f0c54757cf20624c76e6757c4595aa5ff3f80d69479f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef
318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c753f098aae83a1b91b668b00166c9de9e5fc03f7a39708263241e934d83fb81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9qdv\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.856045 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8cf02d3-2a07-464d-b75f-8d3ad8374553\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d527700428fdbde47902ae05e7592fb0f3b43af16fa72386eb4a1ff274156a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104130e8d59df60560691070c803ca780c06ea5439f8d04c160d53df2042c41f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 00:07:59.169352 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 00:07:59.172022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-365547171/tls.crt::/tmp/serving-cert-365547171/tls.key\\\\\\\"\\\\nI0221 00:08:05.242429 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 00:08:05.251263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 00:08:05.251291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 00:08:05.251317 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 00:08:05.251325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 00:08:05.269495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0221 00:08:05.269788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269943 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 00:08:05.269971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 00:08:05.269996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 00:08:05.270022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0221 00:08:05.269815 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0221 00:08:05.276526 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.874174 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ceaf11f6f4645b302c0c6371902385cc6f8d6bdaa01cedf578765b7d72ca44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.891651 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e655a3c2f3a6223fe12dded10c0c7b2e8b5024914ead29887a06efd4141b670e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c0b66a5fe8b81df7b0edfcef98acab5208abfce4cedc368707c6ce804b9b99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.906919 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.906979 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.906996 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.907019 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.907035 4906 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:13Z","lastTransitionTime":"2026-02-21T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.907164 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hbhzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24b0c67-fe8d-4e72-916a-d82306a8b82e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eec6020976a803c47d16f08a33b17b7b9654b2a9b8dd75a2c6f0a5cde4b2b963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5zp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hbhzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.932219 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23efa997-378b-44cd-9f05-4a80559cd09b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmsd9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.952757 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.970576 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfzxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1957f1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1957f1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfzxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:13 crc kubenswrapper[4906]: I0221 00:08:13.988941 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3359de6c-dfba-4630-b39d-68e056b5d2ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7928eee765ac7aaa7118868638603d627a00b59850dac177c991754fc324122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d4b213ed7600da4237ff1c423d004e3afdaff5f599b453398266f6cdae16ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3218b20e5cf2be1651a9fbaef483c8cb2bd297449319f559727af1a47661c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f5568f3c4167bda4510df4f94df728f3286e7c8137bb0ac9f6af2c30c4992a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.003267 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:14Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.010007 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.010056 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.010070 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 
00:08:14.010086 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.010099 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:14Z","lastTransitionTime":"2026-02-21T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.017326 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:14Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.034383 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f1d03697352fb3306d487c9b25ee0f95f98488fe929392eac603103299efd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T00:08:14Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.069973 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cqkxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15db4e7-a13a-4bd9-8083-1ed09be64a82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4f38fff46919386afd729ea3f6437497e71eeeb557d7f2b955d0677b822a86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cndgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cqkxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T00:08:14Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.117461 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.117515 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.117527 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.117543 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.117555 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:14Z","lastTransitionTime":"2026-02-21T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.122492 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d011e7-c2df-4b35-baf1-8b0404a8ae51\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686312fec8d89c23b2ef872e891fd0c1ba1279b0a64c834893c2d8431fca05bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bc1cd9f332e30d7fde3e05085076734d742d2bf375b3b5a55ba7a7e42e0a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbe8b3f30d9bff0951ca61694164e4ece35ca5d3562ad3571150b1cd13236c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255599132ce34bbd49e4298935a18d1294b2055069c606d796d58656a5552a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c798be0178552e289398fd4312d627282ba16f1cd39b2eb7d6e2ba6d4277323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:14Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.220550 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.220928 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.220940 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.220958 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.220970 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:14Z","lastTransitionTime":"2026-02-21T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.324323 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.324394 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.324415 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.324445 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.324467 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:14Z","lastTransitionTime":"2026-02-21T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.426699 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.426726 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.426735 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.426752 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.426763 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:14Z","lastTransitionTime":"2026-02-21T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.471792 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 09:42:47.849409898 +0000 UTC Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.529774 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.529839 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.529863 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.529886 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.529903 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:14Z","lastTransitionTime":"2026-02-21T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.631867 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.631913 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.631924 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.631943 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.631955 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:14Z","lastTransitionTime":"2026-02-21T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.736074 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.736128 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.736144 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.736164 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.736178 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:14Z","lastTransitionTime":"2026-02-21T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.817389 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" event={"ID":"23efa997-378b-44cd-9f05-4a80559cd09b","Type":"ContainerStarted","Data":"5c086405f9c418aadd2b8f9b1cb15daca95cb19501e3701965767c9b1c900db6"} Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.817812 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.817867 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.827582 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tfzxw" event={"ID":"43e3c12f-3db0-4bb9-abb2-e78756ad93a5","Type":"ContainerStarted","Data":"c8dc1d37c07d67efbc80d224af9d36cba5c59cd0d51c72f41f7f01b1e5fed80a"} Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.840443 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.840544 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.840566 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.840633 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.840653 4906 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:14Z","lastTransitionTime":"2026-02-21T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.845937 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e655a3c2f3a6223fe12dded10c0c7b2e8b5024914ead29887a06efd4141b670e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c0b66a5fe8b81df7b0edfcef98acab5208abfce4cedc368707c6ce804b9b99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:14Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.855644 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-x4dw6"] Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.856066 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-x4dw6" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.862146 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.862171 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.864732 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.864866 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.864873 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.865388 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.870269 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:14Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.888878 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfzxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1957f1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1957f1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfzxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:14Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.903171 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hbhzd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24b0c67-fe8d-4e72-916a-d82306a8b82e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eec6020976a803c47d16f08a33b17b7b9654b2a9b8dd75a2c6f0a5cde4b2b963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5zp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hbhzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:14Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.934286 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23efa997-378b-44cd-9f05-4a80559cd09b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae4c734bcf6dc35c8437a49887777a17b228f7196bf2d77a8179923f121351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1574681de69105a2b0693609c82ef8c0cd554037fe2f4a2fd7345650bf4a006\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ead16930cc675e04c24dd469e0e7f7ada74a7ef3fe87e095e6702204361d89bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd0d9e0d6ac22632f0ca07c3ad926d99b71e9d1757e8ef0ee87a2008afbc671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e86b35bdd91ab61ef7469c956499be0dd990275293748f4fb8f43f2f42d84fc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8f4202ebd523d2b63c5625f20cb60477ce419a5b34d926dccab2661b2c107aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c086405f9c418aadd2b8f9b1cb15daca95cb19501e3701965767c9b1c900db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef79e3c346342fa9d098b503a7dff636ce88cb88465cf6e802b9581980e46d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmsd9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:14Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.944004 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.944044 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.944057 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.944074 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.944086 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:14Z","lastTransitionTime":"2026-02-21T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.952633 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cqkxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15db4e7-a13a-4bd9-8083-1ed09be64a82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4f38fff46919386afd729ea3f6437497e71eeeb557d7f2b955d0677b822a86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cndgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cqkxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:14Z 
is after 2025-08-24T17:21:41Z" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.980011 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b5dbf78b-7aff-48c9-9064-b47deb9527b2-serviceca\") pod \"node-ca-x4dw6\" (UID: \"b5dbf78b-7aff-48c9-9064-b47deb9527b2\") " pod="openshift-image-registry/node-ca-x4dw6" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.980077 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6w55s\" (UniqueName: \"kubernetes.io/projected/b5dbf78b-7aff-48c9-9064-b47deb9527b2-kube-api-access-6w55s\") pod \"node-ca-x4dw6\" (UID: \"b5dbf78b-7aff-48c9-9064-b47deb9527b2\") " pod="openshift-image-registry/node-ca-x4dw6" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.980267 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b5dbf78b-7aff-48c9-9064-b47deb9527b2-host\") pod \"node-ca-x4dw6\" (UID: \"b5dbf78b-7aff-48c9-9064-b47deb9527b2\") " pod="openshift-image-registry/node-ca-x4dw6" Feb 21 00:08:14 crc kubenswrapper[4906]: I0221 00:08:14.987278 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d011e7-c2df-4b35-baf1-8b0404a8ae51\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686312fec8d89c23b2ef872e891fd0c1ba1279b0a64c834893c2d8431fca05bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bc1cd9f332e30d7fde3e05085076734d742d2bf375b3b5a55ba7a7e42e0a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbe8b3f30d9bff0951ca61694164e4ece35ca5d3562ad3571150b1cd13236c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255599132ce34bbd49e4298935a18d1294b2055069c606d796d58656a5552a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c798be0178552e289398fd4312d627282ba16f1cd39b2eb7d6e2ba6d4277323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:14Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.004356 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3359de6c-dfba-4630-b39d-68e056b5d2ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7928eee765ac7aaa7118868638603d627a00b59850dac177c991754fc324122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d4b213ed7600da4237ff1c423d004e3afdaff5f599b453398266f6cdae16ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3218b20e5cf2be1651a9fbaef483c8cb2bd297449319f559727af1a47661c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:
47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f5568f3c4167bda4510df4f94df728f3286e7c8137bb0ac9f6af2c30c4992a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.024419 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.040774 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.050675 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.050965 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.051013 4906 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.051043 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.051070 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:15Z","lastTransitionTime":"2026-02-21T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.062872 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f1d03697352fb3306d487c9b25ee0f95f98488fe929392eac603103299efd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.080773 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b5dbf78b-7aff-48c9-9064-b47deb9527b2-serviceca\") pod \"node-ca-x4dw6\" (UID: \"b5dbf78b-7aff-48c9-9064-b47deb9527b2\") " pod="openshift-image-registry/node-ca-x4dw6" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.080814 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6w55s\" (UniqueName: \"kubernetes.io/projected/b5dbf78b-7aff-48c9-9064-b47deb9527b2-kube-api-access-6w55s\") pod \"node-ca-x4dw6\" (UID: \"b5dbf78b-7aff-48c9-9064-b47deb9527b2\") " pod="openshift-image-registry/node-ca-x4dw6" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.080877 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/b5dbf78b-7aff-48c9-9064-b47deb9527b2-host\") pod \"node-ca-x4dw6\" (UID: \"b5dbf78b-7aff-48c9-9064-b47deb9527b2\") " pod="openshift-image-registry/node-ca-x4dw6" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.080947 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b5dbf78b-7aff-48c9-9064-b47deb9527b2-host\") pod \"node-ca-x4dw6\" (UID: \"b5dbf78b-7aff-48c9-9064-b47deb9527b2\") " pod="openshift-image-registry/node-ca-x4dw6" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.081457 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8cf02d3-2a07-464d-b75f-8d3ad8374553\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d527700428fdbde47902ae05e7592fb0f3b43af16fa72386eb4a1ff274156a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104130e8d59df60560691070c803ca780c06ea5439f8d04c160d53df2042c41f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 00:07:59.169352 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 00:07:59.172022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-365547171/tls.crt::/tmp/serving-cert-365547171/tls.key\\\\\\\"\\\\nI0221 00:08:05.242429 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 00:08:05.251263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 00:08:05.251291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 00:08:05.251317 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 00:08:05.251325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 00:08:05.269495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0221 00:08:05.269788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269943 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 00:08:05.269971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 00:08:05.269996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 00:08:05.270022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0221 00:08:05.269815 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0221 00:08:05.276526 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.082435 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b5dbf78b-7aff-48c9-9064-b47deb9527b2-serviceca\") pod \"node-ca-x4dw6\" (UID: \"b5dbf78b-7aff-48c9-9064-b47deb9527b2\") " pod="openshift-image-registry/node-ca-x4dw6" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.100679 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ceaf11f6f4645b302c0c6371902385cc6f8d6bdaa01cedf578765b7d72ca44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.115507 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17518505-fa81-4399-b6cd-5527dae35ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b5adc60c3a9b0129022f0c54757cf20624c76e6757c4595aa5ff3f80d69479f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c753f098aae83a1b91b668b00166c9de9e5fc03f7a39708263241e934d83fb81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9qdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.115603 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6w55s\" (UniqueName: 
\"kubernetes.io/projected/b5dbf78b-7aff-48c9-9064-b47deb9527b2-kube-api-access-6w55s\") pod \"node-ca-x4dw6\" (UID: \"b5dbf78b-7aff-48c9-9064-b47deb9527b2\") " pod="openshift-image-registry/node-ca-x4dw6" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.135043 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23efa997-378b-44cd-9f05-4a80559cd09b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae4c734bcf6dc35c8437a49887777a17b228f7196bf2d77a8179923f121351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1574681de69105a2b0693609c82ef8c0cd554037fe2f4a2fd7345650bf4a006\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ead16930cc675e04c24dd469e0e7f7ada74a7ef3fe87e095e6702204361d89bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd0d9e0d6ac22632f0ca07c3ad926d99b71e9d1757e8ef0ee87a2008afbc671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e86b35bdd91ab61ef7469c956499be0dd990275293748f4fb8f43f2f42d84fc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8f4202ebd523d2b63c5625f20cb60477ce419a5b34d926dccab2661b2c107aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c086405f9c418aadd2b8f9b1cb15daca95cb19501e3701965767c9b1c900db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef79e3c346342fa9d098b503a7dff636ce88cb88465cf6e802b9581980e46d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmsd9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.152891 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.154544 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.154599 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.154617 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.154643 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.154660 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:15Z","lastTransitionTime":"2026-02-21T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.173792 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-x4dw6" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.173899 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfzxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8dc1d37c07d67efbc80d224af9d36cba5c59cd0d51c72f41f7f01b1e5fed80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:10Z\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1957f1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1957f1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e
6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":
\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfzxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.193097 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hbhzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24b0c67-fe8d-4e72-916a-d82306a8b82e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee
c6020976a803c47d16f08a33b17b7b9654b2a9b8dd75a2c6f0a5cde4b2b963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5zp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hbhzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:15 crc kubenswrapper[4906]: W0221 00:08:15.194667 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5dbf78b_7aff_48c9_9064_b47deb9527b2.slice/crio-11d48bd3d9467a75fd4c7459e29bbedd54717686bb834017bad352ca1f91dd87 WatchSource:0}: Error finding container 11d48bd3d9467a75fd4c7459e29bbedd54717686bb834017bad352ca1f91dd87: Status 404 returned error can't find the container with id 
11d48bd3d9467a75fd4c7459e29bbedd54717686bb834017bad352ca1f91dd87 Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.213787 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.231051 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.247629 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f1d03697352fb3306d487c9b25ee0f95f98488fe929392eac603103299efd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.257406 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.257458 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.257470 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.257488 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.257501 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:15Z","lastTransitionTime":"2026-02-21T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.267898 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cqkxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15db4e7-a13a-4bd9-8083-1ed09be64a82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4f38fff46919386afd729ea3f6437497e71eeeb557d7f2b955d0677b822a86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cndgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cqkxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:15Z 
is after 2025-08-24T17:21:41Z" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.302261 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d011e7-c2df-4b35-baf1-8b0404a8ae51\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686312fec8d89c23b2ef872e891fd0c1ba1279b0a64c834893c2d8431fca05bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\
\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bc1cd9f332e30d7fde3e05085076734d742d2bf375b3b5a55ba7a7e42e0a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbe8b3f30d9bff0951ca61694164e4ece35ca5d3562ad3571150b1cd13236c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255599132ce34bbd49e4298935a18d1294b2055069c606d796d58656a5552a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c798be0178552e289398fd4312d627282ba16f1cd39b2eb7d6e2ba6d4277323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02
-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.317293 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3359de6c-dfba-4630-b39d-68e056b5d2ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7928eee765ac7aaa7118868638603d627a00b59850dac177c991754fc324122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d4b213ed7600da4237ff1c423d004e3afdaff5f599b453398266f6cdae16ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3218b20e5cf2be1651a9fbaef483c8cb2bd297449319f559727af1a47661c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f5568f3c4167bda4510df4f94df728f3286e7c8137bb0ac9f6af2c30c4992a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.330501 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x4dw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5dbf78b-7aff-48c9-9064-b47deb9527b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6w55s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x4dw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.347717 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8cf02d3-2a07-464d-b75f-8d3ad8374553\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d527700428fdbde47902ae05e7592fb0f3b43af16fa72386eb4a1ff274156a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104130e8d59df60560691070c803ca780c06ea5439f8d04c160d53df2042c41f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 00:07:59.169352 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 00:07:59.172022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-365547171/tls.crt::/tmp/serving-cert-365547171/tls.key\\\\\\\"\\\\nI0221 00:08:05.242429 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 00:08:05.251263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 00:08:05.251291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 00:08:05.251317 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 00:08:05.251325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 00:08:05.269495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0221 00:08:05.269788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269943 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 00:08:05.269971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 00:08:05.269996 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 00:08:05.270022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0221 00:08:05.269815 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0221 00:08:05.276526 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.361495 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ceaf11f6f4645b302c0c6371902385cc6f8d6bdaa01cedf578765b7d72ca44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.361919 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.361962 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.361974 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.361994 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.362007 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:15Z","lastTransitionTime":"2026-02-21T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.372354 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17518505-fa81-4399-b6cd-5527dae35ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b5adc60c3a9b0129022f0c54757cf20624c76e6757c4595aa5ff3f80d69479f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c753f098aae83a1b91b668b00166c9de9e5fc03f7a39708263241e934d83fb81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9qdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.386285 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e655a3c2f3a6223fe12dded10c0c7b2e8b5024914ead29887a06efd4141b670e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c0b66a5fe8b81df7b0edfcef98acab5208abfce4cedc368707c6ce804b9b99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.466969 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.467374 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.467391 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.467413 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.467429 4906 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:15Z","lastTransitionTime":"2026-02-21T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.472231 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 11:52:00.547176516 +0000 UTC Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.516989 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.517078 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.517097 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:08:15 crc kubenswrapper[4906]: E0221 00:08:15.517231 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:08:15 crc kubenswrapper[4906]: E0221 00:08:15.517271 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:08:15 crc kubenswrapper[4906]: E0221 00:08:15.517436 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.528522 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e655a3c2f3a6223fe12dded10c0c7b2e8b5024914ead29887a06efd4141b670e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c0b66a5fe8b81df7b0edfcef98acab5208abfce4cedc368707c6ce804b9b99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.538616 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.553594 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfzxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8dc1d37c07d67efbc80d224af9d36cba5c59cd0d51c72f41f7f01b1e5fed80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1957f
1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1957f1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:12Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfzxw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.567377 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hbhzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24b0c67-fe8d-4e72-916a-d82306a8b82e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eec6020976a803c47d16f08a33b17b7b9654b2a9b8dd75a2c6f0a5cde4b2b963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5zp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hbhzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.570446 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.570486 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.570499 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.570515 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.570526 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:15Z","lastTransitionTime":"2026-02-21T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.595161 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23efa997-378b-44cd-9f05-4a80559cd09b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae4c734bcf6dc35c8437a49887777a17b228f7196bf2d77a8179923f121351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1574681de69105a2b0693609c82ef8c0cd554037fe2f4a2fd7345650bf4a006\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ead16930cc675e04c24dd469e0e7f7ada74a7ef3fe87e095e6702204361d89bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd0d9e0d6ac22632f0ca07c3ad926d99b71e9d1757e8ef0ee87a2008afbc671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e86b35bdd91ab61ef7469c956499be0dd990275293748f4fb8f43f2f42d84fc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8f4202ebd523d2b63c5625f20cb60477ce419a5b34d926dccab2661b2c107aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c086405f9c418aadd2b8f9b1cb15daca95cb19501e3701965767c9b1c900db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef79e3c346342fa9d098b503a7dff636ce88cb88465cf6e802b9581980e46d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmsd9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.611411 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f1d03697352fb3306d487c9b25ee0f95f98488fe929392eac603103299efd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.629728 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cqkxl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15db4e7-a13a-4bd9-8083-1ed09be64a82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4f38fff46919386afd729ea3f6437497e71eeeb557d7f2b955d0677b822a86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cndgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cqkxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.654527 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d011e7-c2df-4b35-baf1-8b0404a8ae51\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686312fec8d89c23b2ef872e891fd0c1ba1279b0a64c834893c2d8431fca05bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bc1cd9f332e30d7fde3e05085076734d742d2bf375b3b5a55ba7a7e42e0a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbe8b3f30d9bff0951ca61694164e4ece35ca5d3562ad3571150b1cd13236c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255599132ce34bbd49e4298935a18d1294b2055069c606d796d58656a5552a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c798be0178552e289398fd4312d627282ba16f1cd39b2eb7d6e2ba6d4277323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.673279 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.673316 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.673332 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.673354 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.673371 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:15Z","lastTransitionTime":"2026-02-21T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.673330 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3359de6c-dfba-4630-b39d-68e056b5d2ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7928eee765ac7aaa7118868638603d627a00b59850dac177c991754fc324122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d4b213ed7
600da4237ff1c423d004e3afdaff5f599b453398266f6cdae16ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3218b20e5cf2be1651a9fbaef483c8cb2bd297449319f559727af1a47661c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f5568f3c4167bda4510df4f94df728f3286e7c8137bb0ac9f6af2c30c4992a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.685602 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.702319 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.715066 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8cf02d3-2a07-464d-b75f-8d3ad8374553\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d527700428fdbde47902ae05e7592fb0f3b43af16fa72386eb4a1ff274156a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104130e8d59df60560691070c803ca780c06ea5439f8d04c160d53df2042c41f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 00:07:59.169352 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 00:07:59.172022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-365547171/tls.crt::/tmp/serving-cert-365547171/tls.key\\\\\\\"\\\\nI0221 00:08:05.242429 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 00:08:05.251263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 00:08:05.251291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 00:08:05.251317 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 00:08:05.251325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 00:08:05.269495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0221 00:08:05.269788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269943 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 00:08:05.269971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 00:08:05.269996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 00:08:05.270022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0221 00:08:05.269815 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0221 00:08:05.276526 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.733936 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ceaf11f6f4645b302c0c6371902385cc6f8d6bdaa01cedf578765b7d72ca44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.746859 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17518505-fa81-4399-b6cd-5527dae35ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b5adc60c3a9b0129022f0c54757cf20624c76e6757c4595aa5ff3f80d69479f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c753f098aae83a1b91b668b00166c9de9e5fc03f
7a39708263241e934d83fb81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9qdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.761094 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x4dw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5dbf78b-7aff-48c9-9064-b47deb9527b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6w55s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x4dw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.776308 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.776382 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.776400 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 
00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.776422 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.776439 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:15Z","lastTransitionTime":"2026-02-21T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.839907 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-x4dw6" event={"ID":"b5dbf78b-7aff-48c9-9064-b47deb9527b2","Type":"ContainerStarted","Data":"5d0ce9ee6a3dd83591b7395bbbdd306eea6587bd93b0ce540ccf67b61f66077c"} Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.839977 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-x4dw6" event={"ID":"b5dbf78b-7aff-48c9-9064-b47deb9527b2","Type":"ContainerStarted","Data":"11d48bd3d9467a75fd4c7459e29bbedd54717686bb834017bad352ca1f91dd87"} Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.839995 4906 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.867317 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d011e7-c2df-4b35-baf1-8b0404a8ae51\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686312fec8d89c23b2ef872e891fd0c1ba1279b0a64c834893c2d8431fca05bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bc1cd9f332e30d7fde3e05085076734d742d2bf375b3b5a55ba7a7e42e0a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbe8b3f30d9bff0951ca61694164e4ece35ca5d3562ad3571150b1cd13236c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255599132ce34bbd49e4298935a18d1294b2055069c606d796d58656a5552a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c798be0178552e289398fd4312d627282ba16f1cd39b2eb7d6e2ba6d4277323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.879384 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.879555 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.879632 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.879738 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.879815 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:15Z","lastTransitionTime":"2026-02-21T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.888824 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3359de6c-dfba-4630-b39d-68e056b5d2ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7928eee765ac7aaa7118868638603d627a00b59850dac177c991754fc324122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d4b213ed7
600da4237ff1c423d004e3afdaff5f599b453398266f6cdae16ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3218b20e5cf2be1651a9fbaef483c8cb2bd297449319f559727af1a47661c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f5568f3c4167bda4510df4f94df728f3286e7c8137bb0ac9f6af2c30c4992a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.907305 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.922770 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.941613 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f1d03697352fb3306d487c9b25ee0f95f98488fe929392eac603103299efd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.962760 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cqkxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15db4e7-a13a-4bd9-8083-1ed09be64a82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4f38fff46919386afd729ea3f6437497e71eeeb557d7f2b955d0677b822a86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cndgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cqkxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.982535 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.982890 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.983246 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.983525 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.983721 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:15Z","lastTransitionTime":"2026-02-21T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:15 crc kubenswrapper[4906]: I0221 00:08:15.990556 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8cf02d3-2a07-464d-b75f-8d3ad8374553\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d527700428fdbde47902ae05e7592fb0f3b43af16fa72386eb4a1ff274156a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104130e8d59df60560691070c803ca780c06ea5439f8d04c160d53df2042c41f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 00:07:59.169352 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 00:07:59.172022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-365547171/tls.crt::/tmp/serving-cert-365547171/tls.key\\\\\\\"\\\\nI0221 00:08:05.242429 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 00:08:05.251263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 00:08:05.251291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 00:08:05.251317 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 00:08:05.251325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 00:08:05.269495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0221 00:08:05.269788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269943 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 00:08:05.269971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 00:08:05.269996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 00:08:05.270022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0221 00:08:05.269815 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0221 00:08:05.276526 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.008466 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ceaf11f6f4645b302c0c6371902385cc6f8d6bdaa01cedf578765b7d72ca44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:16Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.019585 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17518505-fa81-4399-b6cd-5527dae35ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b5adc60c3a9b0129022f0c54757cf20624c76e6757c4595aa5ff3f80d69479f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c753f098aae83a1b91b668b00166c9de9e5fc03f
7a39708263241e934d83fb81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9qdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:16Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.031945 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x4dw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5dbf78b-7aff-48c9-9064-b47deb9527b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ce9ee6a3dd83591b7395bbbdd306eea6587bd93b0ce540ccf67b61f66077c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6w55s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x4dw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:16Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.048712 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e655a3c2f3a6223fe12dded10c0c7b2e8b5024914ead29887a06efd4141b670e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21
T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c0b66a5fe8b81df7b0edfcef98acab5208abfce4cedc368707c6ce804b9b99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:16Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.060768 4906 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:16Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.060848 4906 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.077335 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfzxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8dc1d37c07d67efbc80d224af9d36cba5c59cd0d51c72f41f7f01b1e5fed80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1957f
1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1957f1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:12Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfzxw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:16Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.087015 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.087082 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.087096 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.087110 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.087118 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:16Z","lastTransitionTime":"2026-02-21T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.089847 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hbhzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24b0c67-fe8d-4e72-916a-d82306a8b82e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eec6020976a803c47d16f08a33b17b7b9654b2a9b8dd75a2c6f0a5cde4b2b963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5zp2\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hbhzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:16Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.111475 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23efa997-378b-44cd-9f05-4a80559cd09b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae4c734bcf6dc35c8437a49887777a17b228f7196bf2d77a8179923f121351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1574681de69105a2b0693609c82ef8c0cd554037fe2f4a2fd7345650bf4a006\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ead16930cc675e04c24dd469e0e7f7ada74a7ef3fe87e095e6702204361d89bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd0d9e0d6ac22632f0ca07c3ad926d99b71e9d1757e8ef0ee87a2008afbc671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e86b35bdd91ab61ef7469c956499be0dd990275293748f4fb8f43f2f42d84fc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8f4202ebd523d2b63c5625f20cb60477ce419a5b34d926dccab2661b2c107aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c086405f9c418aadd2b8f9b1cb15daca95cb19501e3701965767c9b1c900db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef79e3c346342fa9d098b503a7dff636ce88cb88465cf6e802b9581980e46d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmsd9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:16Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.189830 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.189868 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.189877 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.189895 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.189904 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:16Z","lastTransitionTime":"2026-02-21T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.292823 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.292891 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.292910 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.292937 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.292957 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:16Z","lastTransitionTime":"2026-02-21T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.395788 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.395829 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.395841 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.395859 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.395872 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:16Z","lastTransitionTime":"2026-02-21T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.474766 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 18:27:07.945208111 +0000 UTC Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.498797 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.498838 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.498849 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.498866 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.498878 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:16Z","lastTransitionTime":"2026-02-21T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.601022 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.601058 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.601068 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.601081 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.601092 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:16Z","lastTransitionTime":"2026-02-21T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.703337 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.703579 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.703709 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.703802 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.703889 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:16Z","lastTransitionTime":"2026-02-21T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.806077 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.806293 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.806359 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.806436 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.806508 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:16Z","lastTransitionTime":"2026-02-21T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.843534 4906 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.909466 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.909525 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.909543 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.909566 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:16 crc kubenswrapper[4906]: I0221 00:08:16.909584 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:16Z","lastTransitionTime":"2026-02-21T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.012339 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.012371 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.012380 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.012395 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.012403 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:17Z","lastTransitionTime":"2026-02-21T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.115014 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.115106 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.115123 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.115147 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.115165 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:17Z","lastTransitionTime":"2026-02-21T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.218374 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.218409 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.218418 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.218432 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.218443 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:17Z","lastTransitionTime":"2026-02-21T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.321116 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.321191 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.321209 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.321234 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.321254 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:17Z","lastTransitionTime":"2026-02-21T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.424874 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.424935 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.424945 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.424973 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.424989 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:17Z","lastTransitionTime":"2026-02-21T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.475234 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 21:33:05.396232353 +0000 UTC Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.517560 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.517560 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:08:17 crc kubenswrapper[4906]: E0221 00:08:17.517803 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.517887 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:08:17 crc kubenswrapper[4906]: E0221 00:08:17.517978 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:08:17 crc kubenswrapper[4906]: E0221 00:08:17.518037 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.527016 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.527046 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.527054 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.527068 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.527077 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:17Z","lastTransitionTime":"2026-02-21T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.536768 4906 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.629741 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.629804 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.629825 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.629851 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.629869 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:17Z","lastTransitionTime":"2026-02-21T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.732990 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.733061 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.733079 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.733105 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.733123 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:17Z","lastTransitionTime":"2026-02-21T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.836725 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.836780 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.836807 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.836827 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.836839 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:17Z","lastTransitionTime":"2026-02-21T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.850901 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmsd9_23efa997-378b-44cd-9f05-4a80559cd09b/ovnkube-controller/0.log" Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.858213 4906 generic.go:334] "Generic (PLEG): container finished" podID="23efa997-378b-44cd-9f05-4a80559cd09b" containerID="5c086405f9c418aadd2b8f9b1cb15daca95cb19501e3701965767c9b1c900db6" exitCode=1 Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.858409 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" event={"ID":"23efa997-378b-44cd-9f05-4a80559cd09b","Type":"ContainerDied","Data":"5c086405f9c418aadd2b8f9b1cb15daca95cb19501e3701965767c9b1c900db6"} Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.859909 4906 scope.go:117] "RemoveContainer" containerID="5c086405f9c418aadd2b8f9b1cb15daca95cb19501e3701965767c9b1c900db6" Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.883436 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ceaf11f6f4645b302c0c6371902385cc6f8d6bdaa01cedf578765b7d72ca44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:17Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.901889 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17518505-fa81-4399-b6cd-5527dae35ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b5adc60c3a9b0129022f0c54757cf20624c76e6757c4595aa5ff3f80d69479f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c753f098aae83a1b91b668b00166c9de9e5fc03f7a39708263241e934d83fb81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9qdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:17Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.919442 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x4dw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5dbf78b-7aff-48c9-9064-b47deb9527b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ce9ee6a3dd83591b7395bbbdd306eea6587bd93b0ce540ccf67b61f66077c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6w55s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x4dw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:17Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.939184 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.939210 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.939220 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.939233 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.939243 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:17Z","lastTransitionTime":"2026-02-21T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.939765 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8cf02d3-2a07-464d-b75f-8d3ad8374553\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d527700428fdbde47902ae05e7592fb0f3b43af16fa72386eb4a1ff274156a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104130e8d59df60560691070c803ca780c06ea5439f8d04c160d53df2042c41f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 00:07:59.169352 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 00:07:59.172022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-365547171/tls.crt::/tmp/serving-cert-365547171/tls.key\\\\\\\"\\\\nI0221 00:08:05.242429 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 00:08:05.251263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 00:08:05.251291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 00:08:05.251317 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 00:08:05.251325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 00:08:05.269495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0221 00:08:05.269788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269943 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 00:08:05.269971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 00:08:05.269996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 00:08:05.270022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0221 00:08:05.269815 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0221 00:08:05.276526 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:17Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.959808 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e655a3c2f3a6223fe12dded10c0c7b2e8b5024914ead29887a06efd4141b670e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c0b66a5fe8b81df7b0edfcef98acab5208abfce4cedc368707c6ce804b9b99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:17Z is after 2025-08-24T17:21:41Z" Feb 21 
00:08:17 crc kubenswrapper[4906]: I0221 00:08:17.989761 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfzxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8dc1d37c07d67efbc80d224af9d36cba5c59cd0d51c72f41f7f01b1e5fed80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\
\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2608953f7b004971
889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\
":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1957f1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1957f1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"
terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"po
dIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfzxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:17Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.012668 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hbhzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24b0c67-fe8d-4e72-916a-d82306a8b82e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eec6020976a803c47d16f08a33b17b7b9654b2a9b8dd75a2c6f0a5cde4b2b963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5zp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hbhzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:18Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.043815 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23efa997-378b-44cd-9f05-4a80559cd09b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae4c734bcf6dc35c8437a49887777a17b228f7196bf2d77a8179923f121351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1574681de69105a2b0693609c82ef8c0cd554037fe2f4a2fd7345650bf4a006\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ead16930cc675e04c24dd469e0e7f7ada74a7ef3fe87e095e6702204361d89bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd0d9e0d6ac22632f0ca07c3ad926d99b71e9d1757e8ef0ee87a2008afbc671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e86b35bdd91ab61ef7469c956499be0dd990275293748f4fb8f43f2f42d84fc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8f4202ebd523d2b63c5625f20cb60477ce419a5b34d926dccab2661b2c107aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c086405f9c418aadd2b8f9b1cb15daca95cb19501e3701965767c9b1c900db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c086405f9c418aadd2b8f9b1cb15daca95cb19501e3701965767c9b1c900db6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T00:08:17Z\\\",\\\"message\\\":\\\"r.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0221 00:08:17.180477 6160 reflector.go:311] Stopping reflector *v1.Pod (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0221 00:08:17.180595 6160 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0221 00:08:17.180635 6160 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0221 00:08:17.180794 6160 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0221 00:08:17.181379 6160 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0221 00:08:17.181494 6160 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0221 00:08:17.181948 6160 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0221 00:08:17.182053 6160 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"
/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef79e3c346342fa9d098b503a7dff636ce88cb88465cf6e802b9581980e46d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o
://352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmsd9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:18Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.044556 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.044585 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.044602 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.044623 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.044640 4906 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:18Z","lastTransitionTime":"2026-02-21T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.064811 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:18Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.097777 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d011e7-c2df-4b35-baf1-8b0404a8ae51\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686312fec8d89c23b2ef872e891fd0c1ba1279b0a64c834893c2d8431fca05bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bc1cd9f332e30d7fde3e05085076734d742d2bf375b3b5a55ba7a7e42e0a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbe8b3f30d9bff0951ca61694164e4ece35ca5d3562ad3571150b1cd13236c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255599132ce34bbd49e4298935a18d1294b2055069c606d796d58656a5552a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c798be0178552e289398fd4312d627282ba16f1cd39b2eb7d6e2ba6d4277323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:18Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.118191 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3359de6c-dfba-4630-b39d-68e056b5d2ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7928eee765ac7aaa7118868638603d627a00b59850dac177c991754fc324122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d4b213ed7600da4237ff1c423d004e3afdaff5f599b453398266f6cdae16ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3218b20e5cf2be1651a9fbaef483c8cb2bd297449319f559727af1a47661c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:
47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f5568f3c4167bda4510df4f94df728f3286e7c8137bb0ac9f6af2c30c4992a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:18Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.132358 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:18Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.149163 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.149209 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.149221 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.149240 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.149252 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:18Z","lastTransitionTime":"2026-02-21T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.149537 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:18Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.179002 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f1d03697352fb3306d487c9b25ee0f95f98488fe929392eac603103299efd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T00:08:18Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.209536 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cqkxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15db4e7-a13a-4bd9-8083-1ed09be64a82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4f38fff46919386afd729ea3f6437497e71eeeb557d7f2b955d0677b822a86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cndgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cqkxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T00:08:18Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.252039 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.252076 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.252088 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.252107 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.252120 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:18Z","lastTransitionTime":"2026-02-21T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.354235 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.354271 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.354281 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.354294 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.354302 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:18Z","lastTransitionTime":"2026-02-21T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.456628 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.456664 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.456674 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.456875 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.456900 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:18Z","lastTransitionTime":"2026-02-21T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.475732 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 02:29:24.500107251 +0000 UTC Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.559899 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.559930 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.559939 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.559952 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.559982 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:18Z","lastTransitionTime":"2026-02-21T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.663412 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.663449 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.663460 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.663478 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.663490 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:18Z","lastTransitionTime":"2026-02-21T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.767646 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.767754 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.767772 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.767788 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.767801 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:18Z","lastTransitionTime":"2026-02-21T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.867195 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmsd9_23efa997-378b-44cd-9f05-4a80559cd09b/ovnkube-controller/1.log" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.868355 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmsd9_23efa997-378b-44cd-9f05-4a80559cd09b/ovnkube-controller/0.log" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.870646 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.870760 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.870789 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.870817 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.870842 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:18Z","lastTransitionTime":"2026-02-21T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.874810 4906 generic.go:334] "Generic (PLEG): container finished" podID="23efa997-378b-44cd-9f05-4a80559cd09b" containerID="d5f354772803c2489aacf92e608b3d9007e96963731d028ddc3fd972a1002122" exitCode=1 Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.874874 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" event={"ID":"23efa997-378b-44cd-9f05-4a80559cd09b","Type":"ContainerDied","Data":"d5f354772803c2489aacf92e608b3d9007e96963731d028ddc3fd972a1002122"} Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.874965 4906 scope.go:117] "RemoveContainer" containerID="5c086405f9c418aadd2b8f9b1cb15daca95cb19501e3701965767c9b1c900db6" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.876176 4906 scope.go:117] "RemoveContainer" containerID="d5f354772803c2489aacf92e608b3d9007e96963731d028ddc3fd972a1002122" Feb 21 00:08:18 crc kubenswrapper[4906]: E0221 00:08:18.876436 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-bmsd9_openshift-ovn-kubernetes(23efa997-378b-44cd-9f05-4a80559cd09b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.905349 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d011e7-c2df-4b35-baf1-8b0404a8ae51\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686312fec8d89c23b2ef872e891fd0c1ba1279b0a64c834893c2d8431fca05bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bc1cd9f332e30d7fde3e05085076734d742d2bf375b3b5a55ba7a7e42e0a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbe8b3f30d9bff0951ca61694164e4ece35ca5d3562ad3571150b1cd13236c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255599132ce34bbd49e4298935a18d1294b2055069c606d796d58656a5552a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c798be0178552e289398fd4312d627282ba16f1cd39b2eb7d6e2ba6d4277323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:18Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.918849 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3359de6c-dfba-4630-b39d-68e056b5d2ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7928eee765ac7aaa7118868638603d627a00b59850dac177c991754fc324122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d4b213ed7600da4237ff1c423d004e3afdaff5f599b453398266f6cdae16ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3218b20e5cf2be1651a9fbaef483c8cb2bd297449319f559727af1a47661c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:
47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f5568f3c4167bda4510df4f94df728f3286e7c8137bb0ac9f6af2c30c4992a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:18Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.930185 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:18Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.941773 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:18Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.950887 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f1d03697352fb3306d487c9b25ee0f95f98488fe929392eac603103299efd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T00:08:18Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.960841 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cqkxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15db4e7-a13a-4bd9-8083-1ed09be64a82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4f38fff46919386afd729ea3f6437497e71eeeb557d7f2b955d0677b822a86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cndgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cqkxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T00:08:18Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.974611 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.974645 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.974655 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.974668 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.974676 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:18Z","lastTransitionTime":"2026-02-21T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.975265 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8cf02d3-2a07-464d-b75f-8d3ad8374553\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d527700428fdbde47902ae05e7592fb0f3b43af16fa72386eb4a1ff274156a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104130e8d59df60560691070c803ca780c06ea5439f8d04c160d53df2042c41f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 00:07:59.169352 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 00:07:59.172022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-365547171/tls.crt::/tmp/serving-cert-365547171/tls.key\\\\\\\"\\\\nI0221 00:08:05.242429 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 00:08:05.251263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 00:08:05.251291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 00:08:05.251317 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 00:08:05.251325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 00:08:05.269495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0221 00:08:05.269788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269943 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 00:08:05.269971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 00:08:05.269996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 00:08:05.270022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0221 00:08:05.269815 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0221 00:08:05.276526 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:18Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.986155 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ceaf11f6f4645b302c0c6371902385cc6f8d6bdaa01cedf578765b7d72ca44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:18Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:18 crc kubenswrapper[4906]: I0221 00:08:18.997971 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17518505-fa81-4399-b6cd-5527dae35ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b5adc60c3a9b0129022f0c54757cf20624c76e6757c4595aa5ff3f80d69479f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c753f098aae83a1b91b668b00166c9de9e5fc03f
7a39708263241e934d83fb81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9qdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:18Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.007323 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x4dw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5dbf78b-7aff-48c9-9064-b47deb9527b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ce9ee6a3dd83591b7395bbbdd306eea6587bd93b0ce540ccf67b61f66077c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6w55s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x4dw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:19Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.018276 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e655a3c2f3a6223fe12dded10c0c7b2e8b5024914ead29887a06efd4141b670e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21
T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c0b66a5fe8b81df7b0edfcef98acab5208abfce4cedc368707c6ce804b9b99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:19Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.028152 4906 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:19Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.046704 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfzxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8dc1d37c07d67efbc80d224af9d36cba5c59cd0d51c72f41f7f01b1e5fed80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1957f
1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1957f1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:12Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfzxw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:19Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.057915 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hbhzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24b0c67-fe8d-4e72-916a-d82306a8b82e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eec6020976a803c47d16f08a33b17b7b9654b2a9b8dd75a2c6f0a5cde4b2b963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5zp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hbhzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:19Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.077805 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.078030 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.078133 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.078270 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.078378 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:19Z","lastTransitionTime":"2026-02-21T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.078389 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23efa997-378b-44cd-9f05-4a80559cd09b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae4c734bcf6dc35c8437a49887777a17b228f7196bf2d77a8179923f121351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1574681de69105a2b0693609c82ef8c0cd554037fe2f4a2fd7345650bf4a006\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ead16930cc675e04c24dd469e0e7f7ada74a7ef3fe87e095e6702204361d89bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd0d9e0d6ac22632f0ca07c3ad926d99b71e9d1757e8ef0ee87a2008afbc671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e86b35bdd91ab61ef7469c956499be0dd990275293748f4fb8f43f2f42d84fc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8f4202ebd523d2b63c5625f20cb60477ce419a5b34d926dccab2661b2c107aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5f354772803c2489aacf92e608b3d9007e96963731d028ddc3fd972a1002122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c086405f9c418aadd2b8f9b1cb15daca95cb19501e3701965767c9b1c900db6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T00:08:17Z\\\",\\\"message\\\":\\\"r.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0221 00:08:17.180477 6160 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0221 00:08:17.180595 6160 reflector.go:311] Stopping reflector *v1.Service (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0221 00:08:17.180635 6160 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0221 00:08:17.180794 6160 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0221 00:08:17.181379 6160 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0221 00:08:17.181494 6160 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0221 00:08:17.181948 6160 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0221 00:08:17.182053 6160 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5f354772803c2489aacf92e608b3d9007e96963731d028ddc3fd972a1002122\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T00:08:18Z\\\",\\\"message\\\":\\\"}\\\\nI0221 00:08:18.749167 6317 services_controller.go:454] Service openshift-kube-scheduler/scheduler for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0221 00:08:18.749198 6317 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy 
controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:18Z is after 2025-08-24T17:21:41Z]\\\\nI0221 00:08:18.749168 6317 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator-machine-webhook]} name:Service_openshift-machine-api/mac\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-c
ni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef79e3c346342fa9d098b503a7dff636ce88cb88465cf6e802b9581980e46d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmsd9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:19Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.181957 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.181990 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.181999 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.182012 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.182021 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:19Z","lastTransitionTime":"2026-02-21T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.284909 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.284962 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.284975 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.284995 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.285007 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:19Z","lastTransitionTime":"2026-02-21T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.387561 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.387620 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.387638 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.387660 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.387676 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:19Z","lastTransitionTime":"2026-02-21T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.476872 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 16:03:43.773074208 +0000 UTC Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.511341 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.516751 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.516802 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:08:19 crc kubenswrapper[4906]: E0221 00:08:19.516938 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.517043 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:08:19 crc kubenswrapper[4906]: E0221 00:08:19.517164 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:08:19 crc kubenswrapper[4906]: E0221 00:08:19.517247 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.519054 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.519118 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.519153 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.519178 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:19Z","lastTransitionTime":"2026-02-21T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.631285 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.631343 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.631358 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.631377 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.631389 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:19Z","lastTransitionTime":"2026-02-21T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.734375 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.734439 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.734457 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.734484 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.734504 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:19Z","lastTransitionTime":"2026-02-21T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.837658 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.838017 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.838142 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.838275 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.838387 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:19Z","lastTransitionTime":"2026-02-21T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.882063 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmsd9_23efa997-378b-44cd-9f05-4a80559cd09b/ovnkube-controller/1.log" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.899206 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.900500 4906 scope.go:117] "RemoveContainer" containerID="d5f354772803c2489aacf92e608b3d9007e96963731d028ddc3fd972a1002122" Feb 21 00:08:19 crc kubenswrapper[4906]: E0221 00:08:19.900841 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-bmsd9_openshift-ovn-kubernetes(23efa997-378b-44cd-9f05-4a80559cd09b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.922454 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8cf02d3-2a07-464d-b75f-8d3ad8374553\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d527700428fdbde47902ae05e7592fb0f3b43af16fa72386eb4a1ff274156a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104130e8d59df60560691070c803ca780c06ea5439f8d04c160d53df2042c41f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 00:07:59.169352 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 00:07:59.172022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-365547171/tls.crt::/tmp/serving-cert-365547171/tls.key\\\\\\\"\\\\nI0221 00:08:05.242429 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 00:08:05.251263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 00:08:05.251291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 00:08:05.251317 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 00:08:05.251325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 00:08:05.269495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0221 00:08:05.269788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269943 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 00:08:05.269971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 00:08:05.269996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 00:08:05.270022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0221 00:08:05.269815 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0221 00:08:05.276526 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:19Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.940856 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.940904 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.940918 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.940936 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.940949 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:19Z","lastTransitionTime":"2026-02-21T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.944583 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ceaf11f6f4645b302c0c6371902385cc6f8d6bdaa01cedf578765b7d72ca44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:19Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.959508 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17518505-fa81-4399-b6cd-5527dae35ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b5adc60c3a9b0129022f0c54757cf20624c76e6757c4595aa5ff3f80d69479f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube
-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c753f098aae83a1b91b668b00166c9de9e5fc03f7a39708263241e934d83fb81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9qdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-21T00:08:19Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.975973 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x4dw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5dbf78b-7aff-48c9-9064-b47deb9527b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ce9ee6a3dd83591b7395bbbdd306eea6587bd93b0ce540ccf67b61f66077c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6w55s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x4dw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:19Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:19 crc kubenswrapper[4906]: I0221 00:08:19.994302 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e655a3c2f3a6223fe12dded10c0c7b2e8b5024914ead29887a06efd4141b670e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c0b66a5fe8b81df7b0edfcef98acab5208abfce4cedc368707c6ce804b9b99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:19Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.012007 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:20Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.029029 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfzxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8dc1d37c07d67efbc80d224af9d36cba5c59cd0d51c72f41f7f01b1e5fed80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1957f
1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1957f1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:12Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfzxw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:20Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.044225 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.044306 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.044318 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.044372 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.044390 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:20Z","lastTransitionTime":"2026-02-21T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.048338 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hbhzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24b0c67-fe8d-4e72-916a-d82306a8b82e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eec6020976a803c47d16f08a33b17b7b9654b2a9b8dd75a2c6f0a5cde4b2b963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5zp2\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hbhzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:20Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.086649 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23efa997-378b-44cd-9f05-4a80559cd09b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae4c734bcf6dc35c8437a49887777a17b228f7196bf2d77a8179923f121351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1574681de69105a2b0693609c82ef8c0cd554037fe2f4a2fd7345650bf4a006\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ead16930cc675e04c24dd469e0e7f7ada74a7ef3fe87e095e6702204361d89bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd0d9e0d6ac22632f0ca07c3ad926d99b71e9d1757e8ef0ee87a2008afbc671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e86b35bdd91ab61ef7469c956499be0dd990275293748f4fb8f43f2f42d84fc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8f4202ebd523d2b63c5625f20cb60477ce419a5b34d926dccab2661b2c107aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5f354772803c2489aacf92e608b3d9007e96963731d028ddc3fd972a1002122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5f354772803c2489aacf92e608b3d9007e96963731d028ddc3fd972a1002122\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T00:08:18Z\\\",\\\"message\\\":\\\"}\\\\nI0221 00:08:18.749167 6317 services_controller.go:454] Service openshift-kube-scheduler/scheduler for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0221 00:08:18.749198 6317 ovnkube.go:137] failed to run ovnkube: [failed to start network 
controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:18Z is after 2025-08-24T17:21:41Z]\\\\nI0221 00:08:18.749168 6317 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator-machine-webhook]} name:Service_openshift-machine-api/mac\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bmsd9_openshift-ovn-kubernetes(23efa997-378b-44cd-9f05-4a80559cd09b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef79e3c346342fa9d098b503a7dff636ce88cb88465cf6e802b9581980e46d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442b
ced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmsd9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:20Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.110786 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f1d03697352fb3306d487c9b25ee0f95f98488fe929392eac603103299efd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T00:08:20Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.131970 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cqkxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15db4e7-a13a-4bd9-8083-1ed09be64a82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4f38fff46919386afd729ea3f6437497e71eeeb557d7f2b955d0677b822a86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cndgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cqkxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T00:08:20Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.147073 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.147116 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.147130 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.147145 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.147157 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:20Z","lastTransitionTime":"2026-02-21T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.151935 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d011e7-c2df-4b35-baf1-8b0404a8ae51\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686312fec8d89c23b2ef872e891fd0c1ba1279b0a64c834893c2d8431fca05bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bc1cd9f332e30d7fde3e05085076734d742d2bf375b3b5a55ba7a7e42e0a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbe8b3f30d9bff0951ca61694164e4ece35ca5d3562ad3571150b1cd13236c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255599132ce34bbd49e4298935a18d1294b2055069c606d796d58656a5552a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c798be0178552e289398fd4312d627282ba16f1cd39b2eb7d6e2ba6d4277323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:20Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.164678 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3359de6c-dfba-4630-b39d-68e056b5d2ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7928eee765ac7aaa7118868638603d627a00b59850dac177c991754fc324122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d4b213ed7600da4237ff1c423d004e3afdaff5f599b453398266f6cdae16ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3218b20e5cf2be1651a9fbaef483c8cb2bd297449319f559727af1a47661c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f5568f3c4167bda4510df4f94df728f3286e7c8137bb0ac9f6af2c30c4992a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:20Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.177330 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:20Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.193455 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:20Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.249806 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.249843 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.249867 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.249883 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.249893 4906 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:20Z","lastTransitionTime":"2026-02-21T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.351784 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.351847 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.351864 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.351892 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.351910 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:20Z","lastTransitionTime":"2026-02-21T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.454843 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.454921 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.454949 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.454980 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.455003 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:20Z","lastTransitionTime":"2026-02-21T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.477316 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 09:23:29.687338209 +0000 UTC Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.557991 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.558064 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.558079 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.558119 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.558133 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:20Z","lastTransitionTime":"2026-02-21T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.664051 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.664130 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.664154 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.664210 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.664243 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:20Z","lastTransitionTime":"2026-02-21T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.766795 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.766868 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.766891 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.766919 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.766938 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:20Z","lastTransitionTime":"2026-02-21T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.869423 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.869474 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.869486 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.869507 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.869518 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:20Z","lastTransitionTime":"2026-02-21T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.902346 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qqnnm"] Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.903177 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qqnnm" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.907989 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.908376 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.928784 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:20Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.952564 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfzxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8dc1d37c07d67efbc80d224af9d36cba5c59cd0d51c72f41f7f01b1e5fed80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1957f
1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1957f1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:12Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfzxw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:20Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.968953 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hbhzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24b0c67-fe8d-4e72-916a-d82306a8b82e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eec6020976a803c47d16f08a33b17b7b9654b2a9b8dd75a2c6f0a5cde4b2b963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5zp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hbhzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:20Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.971402 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.971430 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.971441 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.971458 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:20 crc kubenswrapper[4906]: I0221 00:08:20.971471 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:20Z","lastTransitionTime":"2026-02-21T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.000086 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23efa997-378b-44cd-9f05-4a80559cd09b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae4c734bcf6dc35c8437a49887777a17b228f7196bf2d77a8179923f121351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1574681de69105a2b0693609c82ef8c0cd554037fe2f4a2fd7345650bf4a006\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ead16930cc675e04c24dd469e0e7f7ada74a7ef3fe87e095e6702204361d89bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd0d9e0d6ac22632f0ca07c3ad926d99b71e9d1757e8ef0ee87a2008afbc671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e86b35bdd91ab61ef7469c956499be0dd990275293748f4fb8f43f2f42d84fc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8f4202ebd523d2b63c5625f20cb60477ce419a5b34d926dccab2661b2c107aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5f354772803c2489aacf92e608b3d9007e96963731d028ddc3fd972a1002122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5f354772803c2489aacf92e608b3d9007e96963731d028ddc3fd972a1002122\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T00:08:18Z\\\",\\\"message\\\":\\\"}\\\\nI0221 00:08:18.749167 6317 services_controller.go:454] Service openshift-kube-scheduler/scheduler for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0221 00:08:18.749198 6317 ovnkube.go:137] failed to run ovnkube: [failed to start network 
controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:18Z is after 2025-08-24T17:21:41Z]\\\\nI0221 00:08:18.749168 6317 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator-machine-webhook]} name:Service_openshift-machine-api/mac\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bmsd9_openshift-ovn-kubernetes(23efa997-378b-44cd-9f05-4a80559cd09b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef79e3c346342fa9d098b503a7dff636ce88cb88465cf6e802b9581980e46d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442b
ced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmsd9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:20Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.021155 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cqkxl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15db4e7-a13a-4bd9-8083-1ed09be64a82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4f38fff46919386afd729ea3f6437497e71eeeb557d7f2b955d0677b822a86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cndgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cqkxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.039537 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qqnnm" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e340b2-f60e-4535-b762-294c8685122e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjg99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjg99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qqnnm\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.052981 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a2e340b2-f60e-4535-b762-294c8685122e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-qqnnm\" (UID: \"a2e340b2-f60e-4535-b762-294c8685122e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qqnnm" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.053033 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a2e340b2-f60e-4535-b762-294c8685122e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-qqnnm\" (UID: \"a2e340b2-f60e-4535-b762-294c8685122e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qqnnm" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.053075 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjg99\" (UniqueName: \"kubernetes.io/projected/a2e340b2-f60e-4535-b762-294c8685122e-kube-api-access-jjg99\") pod \"ovnkube-control-plane-749d76644c-qqnnm\" (UID: \"a2e340b2-f60e-4535-b762-294c8685122e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qqnnm" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.053158 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a2e340b2-f60e-4535-b762-294c8685122e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-qqnnm\" (UID: 
\"a2e340b2-f60e-4535-b762-294c8685122e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qqnnm" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.071167 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d011e7-c2df-4b35-baf1-8b0404a8ae51\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686312fec8d89c23b2ef872e891fd0c1ba1279b0a64c834893c2d8431fca05bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/k
ubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bc1cd9f332e30d7fde3e05085076734d742d2bf375b3b5a55ba7a7e42e0a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbe8b3f30d9bff0951ca61694164e4ece35ca5d3562ad3571150b1cd13236c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255599132ce34bbd49e4298935a18d1294b2055069c606d796d58656a5552a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58
408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c798be0178552e289398fd4312d627282ba16f1cd39b2eb7d6e2ba6d4277323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a
67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026
-02-21T00:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.074025 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.074176 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.074309 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.074424 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.074517 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:21Z","lastTransitionTime":"2026-02-21T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.090426 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3359de6c-dfba-4630-b39d-68e056b5d2ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7928eee765ac7aaa7118868638603d627a00b59850dac177c991754fc324122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d4b213ed7600da4237ff1c423d004e3afdaff5f599b453398266f6cdae16ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3218b20e5cf2be1651a9fbaef483c8cb2bd297449319f559727af1a47661c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f5568f3c4167bda4510df4f94df728f3286e7c8137bb0ac9f6af2c30c4992a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/
crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.108576 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.122980 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.135511 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f1d03697352fb3306d487c9b25ee0f95f98488fe929392eac603103299efd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.150746 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8cf02d3-2a07-464d-b75f-8d3ad8374553\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d527700428fdbde47902ae05e7592fb0f3b43af16fa72386eb4a1ff274156a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104130e8d59df60560691070c803ca780c06ea5439f8d04c160d53df2042c41f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 00:07:59.169352 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 00:07:59.172022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-365547171/tls.crt::/tmp/serving-cert-365547171/tls.key\\\\\\\"\\\\nI0221 00:08:05.242429 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 00:08:05.251263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 00:08:05.251291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 00:08:05.251317 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 00:08:05.251325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 00:08:05.269495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0221 00:08:05.269788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269943 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 00:08:05.269971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 00:08:05.269996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 00:08:05.270022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0221 00:08:05.269815 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0221 00:08:05.276526 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.153754 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjg99\" (UniqueName: \"kubernetes.io/projected/a2e340b2-f60e-4535-b762-294c8685122e-kube-api-access-jjg99\") pod \"ovnkube-control-plane-749d76644c-qqnnm\" (UID: \"a2e340b2-f60e-4535-b762-294c8685122e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qqnnm" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.153811 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a2e340b2-f60e-4535-b762-294c8685122e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-qqnnm\" (UID: \"a2e340b2-f60e-4535-b762-294c8685122e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qqnnm" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.153864 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a2e340b2-f60e-4535-b762-294c8685122e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-qqnnm\" (UID: \"a2e340b2-f60e-4535-b762-294c8685122e\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qqnnm" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.153909 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a2e340b2-f60e-4535-b762-294c8685122e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-qqnnm\" (UID: \"a2e340b2-f60e-4535-b762-294c8685122e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qqnnm" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.154574 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a2e340b2-f60e-4535-b762-294c8685122e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-qqnnm\" (UID: \"a2e340b2-f60e-4535-b762-294c8685122e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qqnnm" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.154636 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a2e340b2-f60e-4535-b762-294c8685122e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-qqnnm\" (UID: \"a2e340b2-f60e-4535-b762-294c8685122e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qqnnm" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.159760 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a2e340b2-f60e-4535-b762-294c8685122e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-qqnnm\" (UID: \"a2e340b2-f60e-4535-b762-294c8685122e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qqnnm" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.163599 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ceaf11f6f4645b302c0c6371902385cc6f8d6bdaa01cedf578765b7d72ca44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.169350 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjg99\" (UniqueName: \"kubernetes.io/projected/a2e340b2-f60e-4535-b762-294c8685122e-kube-api-access-jjg99\") pod \"ovnkube-control-plane-749d76644c-qqnnm\" (UID: \"a2e340b2-f60e-4535-b762-294c8685122e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qqnnm" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.174761 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17518505-fa81-4399-b6cd-5527dae35ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b5adc60c3a9b0129022f0c54757cf20624c76e6757c4595aa5ff3f80d69479f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c753f098aae83a1b91b668b00166c9de9e5fc03f7a39708263241e934d83fb81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9qdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.176283 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.176315 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.176339 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.176353 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.176362 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:21Z","lastTransitionTime":"2026-02-21T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.184529 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x4dw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5dbf78b-7aff-48c9-9064-b47deb9527b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ce9ee6a3dd83591b7395bbbdd306eea6587bd93b0ce540ccf67b61f66077c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6w55s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x4dw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.194771 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e655a3c2f3a6223fe12dded10c0c7b2e8b5024914ead29887a06efd4141b670e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c0b66a5fe8b81df7b0edfcef98acab5208abfce4cedc368707c6ce804b9b99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-02-21T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.224153 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qqnnm" Feb 21 00:08:21 crc kubenswrapper[4906]: W0221 00:08:21.239918 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2e340b2_f60e_4535_b762_294c8685122e.slice/crio-42df1925252a23796cc3581fdcf0c4a46f3355e18c5d333f50115228c837c7ab WatchSource:0}: Error finding container 42df1925252a23796cc3581fdcf0c4a46f3355e18c5d333f50115228c837c7ab: Status 404 returned error can't find the container with id 42df1925252a23796cc3581fdcf0c4a46f3355e18c5d333f50115228c837c7ab Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.278282 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.278319 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.278329 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.278344 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.278353 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:21Z","lastTransitionTime":"2026-02-21T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.355485 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:08:21 crc kubenswrapper[4906]: E0221 00:08:21.355729 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:08:37.3557119 +0000 UTC m=+52.607299406 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.380289 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.380354 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.380375 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.380403 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:21 crc kubenswrapper[4906]: 
I0221 00:08:21.380421 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:21Z","lastTransitionTime":"2026-02-21T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.457133 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.457193 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.457228 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.457266 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:08:21 crc kubenswrapper[4906]: E0221 00:08:21.457308 4906 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 21 00:08:21 crc kubenswrapper[4906]: E0221 00:08:21.457390 4906 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 21 00:08:21 crc kubenswrapper[4906]: E0221 00:08:21.457394 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 21 00:08:21 crc kubenswrapper[4906]: E0221 00:08:21.457431 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 21 00:08:21 crc kubenswrapper[4906]: E0221 00:08:21.457445 4906 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 00:08:21 crc kubenswrapper[4906]: E0221 00:08:21.457401 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-21 00:08:37.45737824 +0000 UTC m=+52.708965746 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 21 00:08:21 crc kubenswrapper[4906]: E0221 00:08:21.457394 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 21 00:08:21 crc kubenswrapper[4906]: E0221 00:08:21.457523 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-21 00:08:37.457502423 +0000 UTC m=+52.709089989 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 21 00:08:21 crc kubenswrapper[4906]: E0221 00:08:21.457545 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-21 00:08:37.457537524 +0000 UTC m=+52.709125170 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 00:08:21 crc kubenswrapper[4906]: E0221 00:08:21.457548 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 21 00:08:21 crc kubenswrapper[4906]: E0221 00:08:21.457567 4906 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 00:08:21 crc kubenswrapper[4906]: E0221 00:08:21.457618 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-21 00:08:37.457600836 +0000 UTC m=+52.709188402 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.477772 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 22:37:07.429846123 +0000 UTC Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.481950 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.481971 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.481978 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.481990 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.482001 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:21Z","lastTransitionTime":"2026-02-21T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.516557 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:08:21 crc kubenswrapper[4906]: E0221 00:08:21.516707 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.517061 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:08:21 crc kubenswrapper[4906]: E0221 00:08:21.517116 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.517243 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:08:21 crc kubenswrapper[4906]: E0221 00:08:21.517302 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.586043 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.586086 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.586099 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.586117 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.586134 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:21Z","lastTransitionTime":"2026-02-21T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.643397 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-rhw7p"] Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.648120 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:08:21 crc kubenswrapper[4906]: E0221 00:08:21.648221 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhw7p" podUID="7544a92e-993a-46af-9f26-243f53d1206d" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.666972 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfzxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8dc1d37c07d67efbc80d224af9d36cba5c59cd0d51c72f41f7f01b1e5fed80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servic
eaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1957f1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1957f1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfzxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.678021 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hbhzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24b0c67-fe8d-4e72-916a-d82306a8b82e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eec6020976a803c47d16f08a33b17b7b9654b2a9b8dd75a2c6f0a5cde4b2b963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b1
9888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5zp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hbhzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.688540 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.688579 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.688591 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.688611 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:21 crc 
kubenswrapper[4906]: I0221 00:08:21.688625 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:21Z","lastTransitionTime":"2026-02-21T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.694601 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23efa997-378b-44cd-9f05-4a80559cd09b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae4c734bcf6dc35c8437a49887777a17b228f7196bf2d77a8179923f121351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1574681de69105a2b0693609c82ef8c0cd554037fe2f4a2fd7345650bf4a006\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ead16930cc675e04c24dd469e0e7f7ada74a7ef3fe87e095e6702204361d89bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd0d9e0d6ac22632f0ca07c3ad926d99b71e9d1757e8ef0ee87a2008afbc671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e86b35bdd91ab61ef7469c956499be0dd990275293748f4fb8f43f2f42d84fc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8f4202ebd523d2b63c5625f20cb60477ce419a5b34d926dccab2661b2c107aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5f354772803c2489aacf92e608b3d9007e96963731d028ddc3fd972a1002122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5f354772803c2489aacf92e608b3d9007e96963731d028ddc3fd972a1002122\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T00:08:18Z\\\",\\\"message\\\":\\\"}\\\\nI0221 00:08:18.749167 6317 services_controller.go:454] Service openshift-kube-scheduler/scheduler for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0221 00:08:18.749198 6317 ovnkube.go:137] failed to run ovnkube: [failed to start network 
controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:18Z is after 2025-08-24T17:21:41Z]\\\\nI0221 00:08:18.749168 6317 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator-machine-webhook]} name:Service_openshift-machine-api/mac\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bmsd9_openshift-ovn-kubernetes(23efa997-378b-44cd-9f05-4a80559cd09b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef79e3c346342fa9d098b503a7dff636ce88cb88465cf6e802b9581980e46d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442b
ced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmsd9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.706818 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rhw7p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7544a92e-993a-46af-9f26-243f53d1206d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46f56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46f56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rhw7p\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.719667 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.744774 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d011e7-c2df-4b35-baf1-8b0404a8ae51\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686312fec8d89c23b2ef872e891fd0c1ba1279b0a64c834893c2d8431fca05bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bc1cd9f332e30d7fde3e05085076734d742d2bf375b3b5a55ba7a7e42e0a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbe8b3f30d9bff0951ca61694164e4ece35ca5d3562ad3571150b1cd13236c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255599132ce34bbd49e4298935a18d1294b2055069c606d796d58656a5552a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c798be0178552e289398fd4312d627282ba16f1cd39b2eb7d6e2ba6d4277323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.758042 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3359de6c-dfba-4630-b39d-68e056b5d2ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7928eee765ac7aaa7118868638603d627a00b59850dac177c991754fc324122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d4b213ed7600da4237ff1c423d004e3afdaff5f599b453398266f6cdae16ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3218b20e5cf2be1651a9fbaef483c8cb2bd297449319f559727af1a47661c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:
47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f5568f3c4167bda4510df4f94df728f3286e7c8137bb0ac9f6af2c30c4992a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.760307 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46f56\" (UniqueName: 
\"kubernetes.io/projected/7544a92e-993a-46af-9f26-243f53d1206d-kube-api-access-46f56\") pod \"network-metrics-daemon-rhw7p\" (UID: \"7544a92e-993a-46af-9f26-243f53d1206d\") " pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.760344 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7544a92e-993a-46af-9f26-243f53d1206d-metrics-certs\") pod \"network-metrics-daemon-rhw7p\" (UID: \"7544a92e-993a-46af-9f26-243f53d1206d\") " pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.772369 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.785083 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.790973 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.791003 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.791012 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.791025 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.791036 4906 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:21Z","lastTransitionTime":"2026-02-21T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.795860 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f1d03697352fb3306d487c9b25ee0f95f98488fe929392eac603103299efd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.809712 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cqkxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15db4e7-a13a-4bd9-8083-1ed09be64a82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4f38fff46919386afd729ea3f6437497e71eeeb557d7f2b955d0677b822a86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cndgv\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cqkxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.821497 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.821547 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.821560 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.821579 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.821592 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:21Z","lastTransitionTime":"2026-02-21T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.822836 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qqnnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e340b2-f60e-4535-b762-294c8685122e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjg99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjg99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qqnnm\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:21 crc kubenswrapper[4906]: E0221 00:08:21.834194 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae66
9\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\
\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":
485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4d8b0bdc-2182-48d0-bb15-cc57765305f9\\\",\\\"sys
temUUID\\\":\\\"94310220-1d46-4502-bb0a-b3628ff11479\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.834644 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ceaf11f6f4645b302c0c6371902385cc6f8d6bdaa01cedf578765b7d72ca44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnl
y\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.837497 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.837537 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.837548 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.837564 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.837574 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:21Z","lastTransitionTime":"2026-02-21T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.847286 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17518505-fa81-4399-b6cd-5527dae35ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b5adc60c3a9b0129022f0c54757cf20624c76e6757c4595aa5ff3f80d69479f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c753f098aae83a1b91b668b00166c9de9e5fc03f7a39708263241e934d83fb81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9qdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:21 crc kubenswrapper[4906]: E0221 00:08:21.849330 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4d8b0bdc-2182-48d0-bb15-cc57765305f9\\\",\\\"systemUUID\\\":\\\"94310220-1d46-4502-bb0a-b3628ff11479\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.852661 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.852720 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.852731 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.852748 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.852760 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:21Z","lastTransitionTime":"2026-02-21T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.857077 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x4dw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5dbf78b-7aff-48c9-9064-b47deb9527b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ce9ee6a3dd83591b7395bbbdd306eea6587bd93b0ce540ccf67b61f66077c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6w55s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x4dw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.860759 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46f56\" (UniqueName: \"kubernetes.io/projected/7544a92e-993a-46af-9f26-243f53d1206d-kube-api-access-46f56\") pod \"network-metrics-daemon-rhw7p\" (UID: \"7544a92e-993a-46af-9f26-243f53d1206d\") " pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.860808 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7544a92e-993a-46af-9f26-243f53d1206d-metrics-certs\") pod \"network-metrics-daemon-rhw7p\" (UID: \"7544a92e-993a-46af-9f26-243f53d1206d\") " pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:08:21 crc kubenswrapper[4906]: E0221 00:08:21.860942 4906 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 21 00:08:21 crc kubenswrapper[4906]: E0221 00:08:21.861033 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7544a92e-993a-46af-9f26-243f53d1206d-metrics-certs 
podName:7544a92e-993a-46af-9f26-243f53d1206d nodeName:}" failed. No retries permitted until 2026-02-21 00:08:22.361016275 +0000 UTC m=+37.612603781 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7544a92e-993a-46af-9f26-243f53d1206d-metrics-certs") pod "network-metrics-daemon-rhw7p" (UID: "7544a92e-993a-46af-9f26-243f53d1206d") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 21 00:08:21 crc kubenswrapper[4906]: E0221 00:08:21.864204 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4d8b0bdc-2182-48d0-bb15-cc57765305f9\\\",\\\"systemUUID\\\":\\\"94310220-1d46-4502-bb0a-b3628ff11479\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.867292 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.867321 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.867330 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.867345 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.867355 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:21Z","lastTransitionTime":"2026-02-21T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.872866 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8cf02d3-2a07-464d-b75f-8d3ad8374553\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d527700428fdbde47902ae05e7592fb0f3b43af16fa72386eb4a1ff274156a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104130e8d59df60560691070c803ca780c06ea5439f8d04c160d53df2042c41f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 00:07:59.169352 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 00:07:59.172022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-365547171/tls.crt::/tmp/serving-cert-365547171/tls.key\\\\\\\"\\\\nI0221 00:08:05.242429 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 00:08:05.251263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 00:08:05.251291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 00:08:05.251317 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 00:08:05.251325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 00:08:05.269495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0221 00:08:05.269788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269943 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 00:08:05.269971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 00:08:05.269996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 00:08:05.270022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0221 00:08:05.269815 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0221 00:08:05.276526 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.876711 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46f56\" (UniqueName: \"kubernetes.io/projected/7544a92e-993a-46af-9f26-243f53d1206d-kube-api-access-46f56\") pod \"network-metrics-daemon-rhw7p\" (UID: \"7544a92e-993a-46af-9f26-243f53d1206d\") " pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:08:21 crc kubenswrapper[4906]: E0221 00:08:21.879889 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4d8b0bdc-2182-48d0-bb15-cc57765305f9\\\",\\\"systemUUID\\\":\\\"94310220-1d46-4502-bb0a-b3628ff11479\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.883618 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.883666 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.883677 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.883715 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.883729 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:21Z","lastTransitionTime":"2026-02-21T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.889024 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e655a3c2f3a6223fe12dded10c0c7b2e8b5024914ead29887a06efd4141b670e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c0b66a5fe8b81df7b0edfcef98acab5208abfce4cedc368707c6ce804b9b99a\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.894817 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qqnnm" event={"ID":"a2e340b2-f60e-4535-b762-294c8685122e","Type":"ContainerStarted","Data":"c46b9920d5411c5f15ee73b4448dee3a9d44d8999d7b842c8d50afc1b5c3fc89"} Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.894876 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qqnnm" 
event={"ID":"a2e340b2-f60e-4535-b762-294c8685122e","Type":"ContainerStarted","Data":"69f35601a717fc132b2180c4d3f2ee38faf5fd0649a002278990d44c2d7be3ab"} Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.894889 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qqnnm" event={"ID":"a2e340b2-f60e-4535-b762-294c8685122e","Type":"ContainerStarted","Data":"42df1925252a23796cc3581fdcf0c4a46f3355e18c5d333f50115228c837c7ab"} Feb 21 00:08:21 crc kubenswrapper[4906]: E0221 00:08:21.901164 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4d8b0bdc-2182-48d0-bb15-cc57765305f9\\\",\\\"systemUUID\\\":\\\"94310220-1d46-4502-bb0a-b3628ff11479\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:21 crc kubenswrapper[4906]: E0221 00:08:21.901277 4906 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.902760 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.902797 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.902809 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.902825 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.902838 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:21Z","lastTransitionTime":"2026-02-21T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.909903 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e655a3c2f3a6223fe12dded10c0c7b2e8b5024914ead29887a06efd4141b670e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://8c0b66a5fe8b81df7b0edfcef98acab5208abfce4cedc368707c6ce804b9b99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.923030 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.937321 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfzxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8dc1d37c07d67efbc80d224af9d36cba5c59cd0d51c72f41f7f01b1e5fed80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1957f
1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1957f1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:12Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfzxw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.951413 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hbhzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24b0c67-fe8d-4e72-916a-d82306a8b82e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eec6020976a803c47d16f08a33b17b7b9654b2a9b8dd75a2c6f0a5cde4b2b963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5zp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hbhzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.975922 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23efa997-378b-44cd-9f05-4a80559cd09b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae4c734bcf6dc35c8437a49887777a17b228f7196bf2d77a8179923f121351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1574681de69105a2b0693609c82ef8c0cd554037fe2f4a2fd7345650bf4a006\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ead16930cc675e04c24dd469e0e7f7ada74a7ef3fe87e095e6702204361d89bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd0d9e0d6ac22632f0ca07c3ad926d99b71e9d1757e8ef0ee87a2008afbc671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e86b35bdd91ab61ef7469c956499be0dd990275293748f4fb8f43f2f42d84fc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8f4202ebd523d2b63c5625f20cb60477ce419a5b34d926dccab2661b2c107aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5f354772803c2489aacf92e608b3d9007e96963731d028ddc3fd972a1002122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5f354772803c2489aacf92e608b3d9007e96963731d028ddc3fd972a1002122\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T00:08:18Z\\\",\\\"message\\\":\\\"}\\\\nI0221 00:08:18.749167 6317 services_controller.go:454] Service 
openshift-kube-scheduler/scheduler for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0221 00:08:18.749198 6317 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:18Z is after 2025-08-24T17:21:41Z]\\\\nI0221 00:08:18.749168 6317 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator-machine-webhook]} name:Service_openshift-machine-api/mac\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bmsd9_openshift-ovn-kubernetes(23efa997-378b-44cd-9f05-4a80559cd09b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef79e3c346342fa9d098b503a7dff636ce88cb88465cf6e802b9581980e46d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442b
ced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmsd9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:21 crc kubenswrapper[4906]: I0221 00:08:21.989817 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rhw7p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7544a92e-993a-46af-9f26-243f53d1206d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46f56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46f56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rhw7p\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.004081 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qqnnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e340b2-f60e-4535-b762-294c8685122e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69f35601a717fc132b2180c4d3f2ee38faf5fd0649a002278990d44c2d7be3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjg99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46b9920d5411c5f15ee73b4448dee3a9d44d8999d7b842c8d50afc1b5c3fc89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjg99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qqnnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-02-21T00:08:22Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.005598 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.005632 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.005643 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.005658 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.005669 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:22Z","lastTransitionTime":"2026-02-21T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.037661 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d011e7-c2df-4b35-baf1-8b0404a8ae51\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686312fec8d89c23b2ef872e891fd0c1ba1279b0a64c834893c2d8431fca05bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bc1cd9f332e30d7fde3e05085076734d742d2bf375b3b5a55ba7a7e42e0a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbe8b3f30d9bff0951ca61694164e4ece35ca5d3562ad3571150b1cd13236c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255599132ce34bbd49e4298935a18d1294b2055069c606d796d58656a5552a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c798be0178552e289398fd4312d627282ba16f1cd39b2eb7d6e2ba6d4277323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:22Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.053496 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3359de6c-dfba-4630-b39d-68e056b5d2ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7928eee765ac7aaa7118868638603d627a00b59850dac177c991754fc324122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d4b213ed7600da4237ff1c423d004e3afdaff5f599b453398266f6cdae16ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3218b20e5cf2be1651a9fbaef483c8cb2bd297449319f559727af1a47661c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f5568f3c4167bda4510df4f94df728f3286e7c8137bb0ac9f6af2c30c4992a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:22Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.068997 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:22Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.084115 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:22Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.100768 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f1d03697352fb3306d487c9b25ee0f95f98488fe929392eac603103299efd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T00:08:22Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.108867 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.108925 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.108938 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.108956 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.108968 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:22Z","lastTransitionTime":"2026-02-21T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.116473 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cqkxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15db4e7-a13a-4bd9-8083-1ed09be64a82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4f38fff46919386afd729ea3f6437497e71eeeb557d7f2b955d0677b822a86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cndgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cqkxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:22Z 
is after 2025-08-24T17:21:41Z" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.133376 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8cf02d3-2a07-464d-b75f-8d3ad8374553\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d527700428fdbde47902ae05e7592fb0f3b43af16fa72386eb4a1ff274156a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104130e8d59df60560691070c803ca780c06ea5439f8d04c160d53df2042c41f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 00:07:59.169352 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 00:07:59.172022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-365547171/tls.crt::/tmp/serving-cert-365547171/tls.key\\\\\\\"\\\\nI0221 00:08:05.242429 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 00:08:05.251263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 00:08:05.251291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 00:08:05.251317 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 00:08:05.251325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 00:08:05.269495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0221 00:08:05.269788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269943 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 00:08:05.269971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 00:08:05.269996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 00:08:05.270022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0221 00:08:05.269815 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0221 00:08:05.276526 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:22Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.151777 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ceaf11f6f4645b302c0c6371902385cc6f8d6bdaa01cedf578765b7d72ca44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:22Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.164037 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.171000 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17518505-fa81-4399-b6cd-5527dae35ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b5adc60c3a9b0129022f0c54757cf20624c76e6757c4595aa5ff3f80d69479f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c753f098aae83a1b91b668b00166c9de9e5fc03f
7a39708263241e934d83fb81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9qdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:22Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.187608 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x4dw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5dbf78b-7aff-48c9-9064-b47deb9527b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ce9ee6a3dd83591b7395bbbdd306eea6587bd93b0ce540ccf67b61f66077c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6w55s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x4dw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:22Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.206387 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e655a3c2f3a6223fe12dded10c0c7b2e8b5024914ead29887a06efd4141b670e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21
T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c0b66a5fe8b81df7b0edfcef98acab5208abfce4cedc368707c6ce804b9b99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:22Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.211594 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.211657 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.211673 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.211717 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.211732 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:22Z","lastTransitionTime":"2026-02-21T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.225418 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:22Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.247084 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfzxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8dc1d37c07d67efbc80d224af9d36cba5c59cd0d51c72f41f7f01b1e5fed80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1957f
1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1957f1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:12Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfzxw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:22Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.263592 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hbhzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24b0c67-fe8d-4e72-916a-d82306a8b82e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eec6020976a803c47d16f08a33b17b7b9654b2a9b8dd75a2c6f0a5cde4b2b963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5zp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hbhzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:22Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.295422 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23efa997-378b-44cd-9f05-4a80559cd09b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae4c734bcf6dc35c8437a49887777a17b228f7196bf2d77a8179923f121351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1574681de69105a2b0693609c82ef8c0cd554037fe2f4a2fd7345650bf4a006\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ead16930cc675e04c24dd469e0e7f7ada74a7ef3fe87e095e6702204361d89bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd0d9e0d6ac22632f0ca07c3ad926d99b71e9d1757e8ef0ee87a2008afbc671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e86b35bdd91ab61ef7469c956499be0dd990275293748f4fb8f43f2f42d84fc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8f4202ebd523d2b63c5625f20cb60477ce419a5b34d926dccab2661b2c107aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5f354772803c2489aacf92e608b3d9007e96963731d028ddc3fd972a1002122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5f354772803c2489aacf92e608b3d9007e96963731d028ddc3fd972a1002122\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T00:08:18Z\\\",\\\"message\\\":\\\"}\\\\nI0221 00:08:18.749167 6317 services_controller.go:454] Service 
openshift-kube-scheduler/scheduler for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0221 00:08:18.749198 6317 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:18Z is after 2025-08-24T17:21:41Z]\\\\nI0221 00:08:18.749168 6317 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator-machine-webhook]} name:Service_openshift-machine-api/mac\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bmsd9_openshift-ovn-kubernetes(23efa997-378b-44cd-9f05-4a80559cd09b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef79e3c346342fa9d098b503a7dff636ce88cb88465cf6e802b9581980e46d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442b
ced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmsd9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:22Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.312152 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rhw7p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7544a92e-993a-46af-9f26-243f53d1206d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46f56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46f56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rhw7p\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:22Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.314021 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.314055 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.314069 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.314087 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.314099 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:22Z","lastTransitionTime":"2026-02-21T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.333897 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f1d03697352fb3306d487c9b25ee0f95f98488fe929392eac603103299efd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:22Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.359390 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cqkxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15db4e7-a13a-4bd9-8083-1ed09be64a82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4f38fff46919386afd729ea3f6437497e71eeeb557d7f2b955d0677b822a86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cndgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"
}}\" for pod \"openshift-multus\"/\"multus-cqkxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:22Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.366510 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7544a92e-993a-46af-9f26-243f53d1206d-metrics-certs\") pod \"network-metrics-daemon-rhw7p\" (UID: \"7544a92e-993a-46af-9f26-243f53d1206d\") " pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:08:22 crc kubenswrapper[4906]: E0221 00:08:22.366723 4906 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 21 00:08:22 crc kubenswrapper[4906]: E0221 00:08:22.366803 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7544a92e-993a-46af-9f26-243f53d1206d-metrics-certs podName:7544a92e-993a-46af-9f26-243f53d1206d nodeName:}" failed. No retries permitted until 2026-02-21 00:08:23.366778373 +0000 UTC m=+38.618365919 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7544a92e-993a-46af-9f26-243f53d1206d-metrics-certs") pod "network-metrics-daemon-rhw7p" (UID: "7544a92e-993a-46af-9f26-243f53d1206d") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.379031 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qqnnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e340b2-f60e-4535-b762-294c8685122e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69f35601a717fc132b2180c4d3f2ee38faf5fd0649a002278990d44c2d7be3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjg99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46b9920d5411c5f15ee73b4448dee3a9d44d8999d7b842c8d50afc1b5c3fc89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjg99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qqnnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-02-21T00:08:22Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.404041 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d011e7-c2df-4b35-baf1-8b0404a8ae51\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686312fec8d89c23b2ef872e891fd0c1ba1279b0a64c834893c2d8431fca05bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bc1cd9f332e30d7fde3e05085076734d742d2bf375b3b5a55ba7a7e42e0a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbe8b3f30d9bff0951ca61694164e4ece35ca5d3562ad3571150b1cd13236c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255599132ce34bbd49e4298935a18d1294b2055069c606d796d58656a5552a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c798be0178552e289398fd4312d627282ba16f1cd39b2eb7d6e2ba6d4277323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:22Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.416594 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.416649 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.416665 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.416720 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.416742 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:22Z","lastTransitionTime":"2026-02-21T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.422072 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3359de6c-dfba-4630-b39d-68e056b5d2ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7928eee765ac7aaa7118868638603d627a00b59850dac177c991754fc324122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID
\\\":\\\"cri-o://57d4b213ed7600da4237ff1c423d004e3afdaff5f599b453398266f6cdae16ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3218b20e5cf2be1651a9fbaef483c8cb2bd297449319f559727af1a47661c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f5568f3c4167bda4510df4f94df728f3286e7c8137bb0ac9f6af2c30c4992a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller
-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:22Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.438123 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:22Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.455506 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:22Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.476918 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8cf02d3-2a07-464d-b75f-8d3ad8374553\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d527700428fdbde47902ae05e7592fb0f3b43af16fa72386eb4a1ff274156a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104130e8d59df60560691070c803ca780c06ea5439f8d04c160d53df2042c41f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T00:08:05Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 00:07:59.169352 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 00:07:59.172022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-365547171/tls.crt::/tmp/serving-cert-365547171/tls.key\\\\\\\"\\\\nI0221 00:08:05.242429 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 00:08:05.251263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 00:08:05.251291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 00:08:05.251317 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 00:08:05.251325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 00:08:05.269495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0221 00:08:05.269788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269943 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 00:08:05.269971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 00:08:05.269996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 00:08:05.270022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0221 00:08:05.269815 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0221 00:08:05.276526 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f2
2cee82852bec2d1f9d010c485e5d1a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:22Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.478787 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 18:21:18.576658011 +0000 UTC Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.494081 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ceaf11f6f4645b302c0c6371902385cc6f8d6bdaa01cedf578765b7d72ca44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:22Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.511503 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17518505-fa81-4399-b6cd-5527dae35ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b5adc60c3a9b0129022f0c54757cf20624c76e6757c4595aa5ff3f80d69479f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c753f098aae83a1b91b668b00166c9de9e5fc03f7a39708263241e934d83fb81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9qdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:22Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.518971 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:22 crc 
kubenswrapper[4906]: I0221 00:08:22.519035 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.519048 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.519088 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.519103 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:22Z","lastTransitionTime":"2026-02-21T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.526450 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x4dw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5dbf78b-7aff-48c9-9064-b47deb9527b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ce9ee6a3dd83591b7395bbbdd306eea6587bd93b0ce540ccf67b61f66077c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6w55s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x4dw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:22Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.622382 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.622440 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.622454 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.622476 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.622490 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:22Z","lastTransitionTime":"2026-02-21T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.725106 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.725174 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.725188 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.725205 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.725216 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:22Z","lastTransitionTime":"2026-02-21T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.828549 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.828606 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.828622 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.828641 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.828655 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:22Z","lastTransitionTime":"2026-02-21T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.930852 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.930917 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.930932 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.930952 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:22 crc kubenswrapper[4906]: I0221 00:08:22.930963 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:22Z","lastTransitionTime":"2026-02-21T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.034369 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.034418 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.034434 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.034457 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.034476 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:23Z","lastTransitionTime":"2026-02-21T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.137368 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.137424 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.137440 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.137462 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.137480 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:23Z","lastTransitionTime":"2026-02-21T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.240638 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.240752 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.240770 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.241226 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.241287 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:23Z","lastTransitionTime":"2026-02-21T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.344965 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.345043 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.345060 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.345567 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.345630 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:23Z","lastTransitionTime":"2026-02-21T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.376047 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7544a92e-993a-46af-9f26-243f53d1206d-metrics-certs\") pod \"network-metrics-daemon-rhw7p\" (UID: \"7544a92e-993a-46af-9f26-243f53d1206d\") " pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:08:23 crc kubenswrapper[4906]: E0221 00:08:23.376217 4906 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 21 00:08:23 crc kubenswrapper[4906]: E0221 00:08:23.376317 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7544a92e-993a-46af-9f26-243f53d1206d-metrics-certs podName:7544a92e-993a-46af-9f26-243f53d1206d nodeName:}" failed. No retries permitted until 2026-02-21 00:08:25.376289753 +0000 UTC m=+40.627877289 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7544a92e-993a-46af-9f26-243f53d1206d-metrics-certs") pod "network-metrics-daemon-rhw7p" (UID: "7544a92e-993a-46af-9f26-243f53d1206d") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.448768 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.448828 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.448846 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.448870 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.448891 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:23Z","lastTransitionTime":"2026-02-21T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.479372 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 02:02:40.399440924 +0000 UTC Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.517184 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.517283 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.517281 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.517202 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:08:23 crc kubenswrapper[4906]: E0221 00:08:23.517476 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:08:23 crc kubenswrapper[4906]: E0221 00:08:23.517598 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhw7p" podUID="7544a92e-993a-46af-9f26-243f53d1206d" Feb 21 00:08:23 crc kubenswrapper[4906]: E0221 00:08:23.517777 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:08:23 crc kubenswrapper[4906]: E0221 00:08:23.517839 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.552065 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.552110 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.552121 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.552137 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.552149 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:23Z","lastTransitionTime":"2026-02-21T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.655119 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.655160 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.655171 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.655192 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.655202 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:23Z","lastTransitionTime":"2026-02-21T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.757974 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.758042 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.758059 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.758084 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.758101 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:23Z","lastTransitionTime":"2026-02-21T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.860552 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.860635 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.860658 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.860717 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.860742 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:23Z","lastTransitionTime":"2026-02-21T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.964382 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.964440 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.964456 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.964476 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 00:08:23 crc kubenswrapper[4906]: I0221 00:08:23.964489 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:23Z","lastTransitionTime":"2026-02-21T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.067151 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.067219 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.067241 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.067274 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.067297 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:24Z","lastTransitionTime":"2026-02-21T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.170101 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.170144 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.170157 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.170174 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.170188 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:24Z","lastTransitionTime":"2026-02-21T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.274292 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.274367 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.274394 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.274422 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.274441 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:24Z","lastTransitionTime":"2026-02-21T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.377478 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.377536 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.377555 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.377580 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.377597 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:24Z","lastTransitionTime":"2026-02-21T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.479554 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 07:02:12.987149684 +0000 UTC
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.481282 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.481468 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.481500 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.481538 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.481574 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:24Z","lastTransitionTime":"2026-02-21T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.585105 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.585206 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.585230 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.585260 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.585287 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:24Z","lastTransitionTime":"2026-02-21T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.688429 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.688486 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.688542 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.688568 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.688590 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:24Z","lastTransitionTime":"2026-02-21T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.792500 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.792570 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.792592 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.792620 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.792642 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:24Z","lastTransitionTime":"2026-02-21T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.895615 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.895675 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.895737 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.895773 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.895797 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:24Z","lastTransitionTime":"2026-02-21T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.998635 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.998730 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.998748 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.998771 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 00:08:24 crc kubenswrapper[4906]: I0221 00:08:24.998789 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:24Z","lastTransitionTime":"2026-02-21T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.102146 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.102200 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.102217 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.102239 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.102255 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:25Z","lastTransitionTime":"2026-02-21T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.205542 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.205590 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.205612 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.205640 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.205662 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:25Z","lastTransitionTime":"2026-02-21T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.308523 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.308569 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.308584 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.308602 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.308612 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:25Z","lastTransitionTime":"2026-02-21T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.399633 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7544a92e-993a-46af-9f26-243f53d1206d-metrics-certs\") pod \"network-metrics-daemon-rhw7p\" (UID: \"7544a92e-993a-46af-9f26-243f53d1206d\") " pod="openshift-multus/network-metrics-daemon-rhw7p"
Feb 21 00:08:25 crc kubenswrapper[4906]: E0221 00:08:25.399946 4906 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 21 00:08:25 crc kubenswrapper[4906]: E0221 00:08:25.400104 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7544a92e-993a-46af-9f26-243f53d1206d-metrics-certs podName:7544a92e-993a-46af-9f26-243f53d1206d nodeName:}" failed. No retries permitted until 2026-02-21 00:08:29.400068323 +0000 UTC m=+44.651655869 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7544a92e-993a-46af-9f26-243f53d1206d-metrics-certs") pod "network-metrics-daemon-rhw7p" (UID: "7544a92e-993a-46af-9f26-243f53d1206d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.411587 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.411648 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.411709 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.411736 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.411755 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:25Z","lastTransitionTime":"2026-02-21T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.480479 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 05:24:04.44313228 +0000 UTC
Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.516076 4906 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 21 00:08:25 crc kubenswrapper[4906]: E0221 00:08:25.516241 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.516450 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 21 00:08:25 crc kubenswrapper[4906]: E0221 00:08:25.516682 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.516975 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p"
Feb 21 00:08:25 crc kubenswrapper[4906]: E0221 00:08:25.517108 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-rhw7p" podUID="7544a92e-993a-46af-9f26-243f53d1206d"
Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.517331 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 21 00:08:25 crc kubenswrapper[4906]: E0221 00:08:25.517448 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.517680 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.517725 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.517736 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.517752 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.517763 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:25Z","lastTransitionTime":"2026-02-21T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.536310 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:25Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.553714 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfzxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8dc1d37c07d67efbc80d224af9d36cba5c59cd0d51c72f41f7f01b1e5fed80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1957f
1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1957f1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:12Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfzxw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:25Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.569872 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hbhzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24b0c67-fe8d-4e72-916a-d82306a8b82e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eec6020976a803c47d16f08a33b17b7b9654b2a9b8dd75a2c6f0a5cde4b2b963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5zp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hbhzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:25Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.588678 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23efa997-378b-44cd-9f05-4a80559cd09b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae4c734bcf6dc35c8437a49887777a17b228f7196bf2d77a8179923f121351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1574681de69105a2b0693609c82ef8c0cd554037fe2f4a2fd7345650bf4a006\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ead16930cc675e04c24dd469e0e7f7ada74a7ef3fe87e095e6702204361d89bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd0d9e0d6ac22632f0ca07c3ad926d99b71e9d1757e8ef0ee87a2008afbc671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e86b35bdd91ab61ef7469c956499be0dd990275293748f4fb8f43f2f42d84fc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8f4202ebd523d2b63c5625f20cb60477ce419a5b34d926dccab2661b2c107aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5f354772803c2489aacf92e608b3d9007e96963731d028ddc3fd972a1002122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5f354772803c2489aacf92e608b3d9007e96963731d028ddc3fd972a1002122\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T00:08:18Z\\\",\\\"message\\\":\\\"}\\\\nI0221 00:08:18.749167 6317 services_controller.go:454] Service 
openshift-kube-scheduler/scheduler for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0221 00:08:18.749198 6317 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:18Z is after 2025-08-24T17:21:41Z]\\\\nI0221 00:08:18.749168 6317 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator-machine-webhook]} name:Service_openshift-machine-api/mac\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bmsd9_openshift-ovn-kubernetes(23efa997-378b-44cd-9f05-4a80559cd09b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef79e3c346342fa9d098b503a7dff636ce88cb88465cf6e802b9581980e46d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442b
ced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmsd9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:25Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.602138 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rhw7p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7544a92e-993a-46af-9f26-243f53d1206d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46f56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46f56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rhw7p\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:25Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.618570 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qqnnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e340b2-f60e-4535-b762-294c8685122e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69f35601a717fc132b2180c4d3f2ee38faf5fd0649a002278990d44c2d7be3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjg99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46b9920d5411c5f15ee73b4448dee3a9d44d8999d7b842c8d50afc1b5c3fc89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjg99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qqnnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-02-21T00:08:25Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.620214 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.620262 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.620275 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.620296 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.620310 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:25Z","lastTransitionTime":"2026-02-21T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.644147 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d011e7-c2df-4b35-baf1-8b0404a8ae51\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686312fec8d89c23b2ef872e891fd0c1ba1279b0a64c834893c2d8431fca05bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bc1cd9f332e30d7fde3e05085076734d742d2bf375b3b5a55ba7a7e42e0a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbe8b3f30d9bff0951ca61694164e4ece35ca5d3562ad3571150b1cd13236c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255599132ce34bbd49e4298935a18d1294b2055069c606d796d58656a5552a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c798be0178552e289398fd4312d627282ba16f1cd39b2eb7d6e2ba6d4277323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:25Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.657175 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3359de6c-dfba-4630-b39d-68e056b5d2ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7928eee765ac7aaa7118868638603d627a00b59850dac177c991754fc324122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d4b213ed7600da4237ff1c423d004e3afdaff5f599b453398266f6cdae16ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3218b20e5cf2be1651a9fbaef483c8cb2bd297449319f559727af1a47661c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f5568f3c4167bda4510df4f94df728f3286e7c8137bb0ac9f6af2c30c4992a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:25Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.672222 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:25Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.690315 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:25Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.704565 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f1d03697352fb3306d487c9b25ee0f95f98488fe929392eac603103299efd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T00:08:25Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.719707 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cqkxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15db4e7-a13a-4bd9-8083-1ed09be64a82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4f38fff46919386afd729ea3f6437497e71eeeb557d7f2b955d0677b822a86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cndgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cqkxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T00:08:25Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.722774 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.722817 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.722828 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.722843 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.722854 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:25Z","lastTransitionTime":"2026-02-21T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.736462 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8cf02d3-2a07-464d-b75f-8d3ad8374553\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d527700428fdbde47902ae05e7592fb0f3b43af16fa72386eb4a1ff274156a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104130e8d59df60560691070c803ca780c06ea5439f8d04c160d53df2042c41f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 00:07:59.169352 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 00:07:59.172022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-365547171/tls.crt::/tmp/serving-cert-365547171/tls.key\\\\\\\"\\\\nI0221 00:08:05.242429 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 00:08:05.251263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 00:08:05.251291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 00:08:05.251317 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 00:08:05.251325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 00:08:05.269495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0221 00:08:05.269788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269943 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 00:08:05.269971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 00:08:05.269996 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 00:08:05.270022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0221 00:08:05.269815 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0221 00:08:05.276526 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:25Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.751092 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ceaf11f6f4645b302c0c6371902385cc6f8d6bdaa01cedf578765b7d72ca44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:25Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.762274 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17518505-fa81-4399-b6cd-5527dae35ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b5adc60c3a9b0129022f0c54757cf20624c76e6757c4595aa5ff3f80d69479f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c753f098aae83a1b91b668b00166c9de9e5fc03f7a39708263241e934d83fb81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9qdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:25Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.774813 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x4dw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5dbf78b-7aff-48c9-9064-b47deb9527b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ce9ee6a3dd83591b7395bbbdd306eea6587bd93b0ce540ccf67b61f66077c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6w55s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x4dw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:25Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.789122 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e655a3c2f3a6223fe12dded10c0c7b2e8b5024914ead29887a06efd4141b670e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21
T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c0b66a5fe8b81df7b0edfcef98acab5208abfce4cedc368707c6ce804b9b99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:25Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.825932 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.826013 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.826033 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.826056 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.826073 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:25Z","lastTransitionTime":"2026-02-21T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.929156 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.929223 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.929240 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.929266 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:25 crc kubenswrapper[4906]: I0221 00:08:25.929285 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:25Z","lastTransitionTime":"2026-02-21T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.033403 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.033461 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.033479 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.033506 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.033524 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:26Z","lastTransitionTime":"2026-02-21T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.136708 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.136757 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.136771 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.136788 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.136803 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:26Z","lastTransitionTime":"2026-02-21T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.239068 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.239123 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.239135 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.239154 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.239166 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:26Z","lastTransitionTime":"2026-02-21T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.364387 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.364427 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.364437 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.364455 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.364467 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:26Z","lastTransitionTime":"2026-02-21T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.467206 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.467260 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.467278 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.467301 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.467319 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:26Z","lastTransitionTime":"2026-02-21T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.481437 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 21:19:02.796100418 +0000 UTC Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.570553 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.570604 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.570619 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.570639 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.570654 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:26Z","lastTransitionTime":"2026-02-21T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.673952 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.673997 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.674006 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.674021 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.674030 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:26Z","lastTransitionTime":"2026-02-21T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.777352 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.777402 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.777414 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.777432 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.777447 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:26Z","lastTransitionTime":"2026-02-21T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.879918 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.879953 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.879961 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.879973 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.879982 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:26Z","lastTransitionTime":"2026-02-21T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.983318 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.983397 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.983729 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.983763 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:26 crc kubenswrapper[4906]: I0221 00:08:26.983788 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:26Z","lastTransitionTime":"2026-02-21T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:27 crc kubenswrapper[4906]: I0221 00:08:27.087088 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:27 crc kubenswrapper[4906]: I0221 00:08:27.087145 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:27 crc kubenswrapper[4906]: I0221 00:08:27.087162 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:27 crc kubenswrapper[4906]: I0221 00:08:27.087187 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:27 crc kubenswrapper[4906]: I0221 00:08:27.087207 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:27Z","lastTransitionTime":"2026-02-21T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:27 crc kubenswrapper[4906]: I0221 00:08:27.189813 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:27 crc kubenswrapper[4906]: I0221 00:08:27.189878 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:27 crc kubenswrapper[4906]: I0221 00:08:27.189894 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:27 crc kubenswrapper[4906]: I0221 00:08:27.189923 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:27 crc kubenswrapper[4906]: I0221 00:08:27.189941 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:27Z","lastTransitionTime":"2026-02-21T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:27 crc kubenswrapper[4906]: I0221 00:08:27.292888 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:27 crc kubenswrapper[4906]: I0221 00:08:27.292961 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:27 crc kubenswrapper[4906]: I0221 00:08:27.292990 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:27 crc kubenswrapper[4906]: I0221 00:08:27.293021 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:27 crc kubenswrapper[4906]: I0221 00:08:27.293045 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:27Z","lastTransitionTime":"2026-02-21T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:27 crc kubenswrapper[4906]: I0221 00:08:27.396452 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:27 crc kubenswrapper[4906]: I0221 00:08:27.396516 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:27 crc kubenswrapper[4906]: I0221 00:08:27.396533 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:27 crc kubenswrapper[4906]: I0221 00:08:27.396567 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:27 crc kubenswrapper[4906]: I0221 00:08:27.396592 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:27Z","lastTransitionTime":"2026-02-21T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:27 crc kubenswrapper[4906]: I0221 00:08:27.481938 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 23:29:26.517557855 +0000 UTC Feb 21 00:08:27 crc kubenswrapper[4906]: I0221 00:08:27.499560 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:27 crc kubenswrapper[4906]: I0221 00:08:27.499622 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:27 crc kubenswrapper[4906]: I0221 00:08:27.499656 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:27 crc kubenswrapper[4906]: I0221 00:08:27.499728 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:27 crc kubenswrapper[4906]: I0221 00:08:27.499752 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:27Z","lastTransitionTime":"2026-02-21T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:27 crc kubenswrapper[4906]: I0221 00:08:27.516084 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:08:27 crc kubenswrapper[4906]: I0221 00:08:27.516149 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:08:27 crc kubenswrapper[4906]: I0221 00:08:27.516195 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:08:27 crc kubenswrapper[4906]: E0221 00:08:27.516333 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:08:27 crc kubenswrapper[4906]: I0221 00:08:27.516484 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:08:27 crc kubenswrapper[4906]: E0221 00:08:27.516651 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhw7p" podUID="7544a92e-993a-46af-9f26-243f53d1206d" Feb 21 00:08:27 crc kubenswrapper[4906]: E0221 00:08:27.516886 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:08:27 crc kubenswrapper[4906]: E0221 00:08:27.517027 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:08:27 crc kubenswrapper[4906]: I0221 00:08:27.603274 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:27 crc kubenswrapper[4906]: I0221 00:08:27.603335 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:27 crc kubenswrapper[4906]: I0221 00:08:27.603359 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:27 crc kubenswrapper[4906]: I0221 00:08:27.603389 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:27 crc kubenswrapper[4906]: I0221 00:08:27.603410 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:27Z","lastTransitionTime":"2026-02-21T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:27 crc kubenswrapper[4906]: I0221 00:08:27.706254 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:27 crc kubenswrapper[4906]: I0221 00:08:27.706309 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:27 crc kubenswrapper[4906]: I0221 00:08:27.706326 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:27 crc kubenswrapper[4906]: I0221 00:08:27.706348 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:27 crc kubenswrapper[4906]: I0221 00:08:27.706365 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:27Z","lastTransitionTime":"2026-02-21T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:27 crc kubenswrapper[4906]: I0221 00:08:27.809675 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:27 crc kubenswrapper[4906]: I0221 00:08:27.809794 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:27 crc kubenswrapper[4906]: I0221 00:08:27.809819 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:27 crc kubenswrapper[4906]: I0221 00:08:27.809848 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:27 crc kubenswrapper[4906]: I0221 00:08:27.809870 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:27Z","lastTransitionTime":"2026-02-21T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:27 crc kubenswrapper[4906]: I0221 00:08:27.912801 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:27 crc kubenswrapper[4906]: I0221 00:08:27.912886 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:27 crc kubenswrapper[4906]: I0221 00:08:27.912910 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:27 crc kubenswrapper[4906]: I0221 00:08:27.912941 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:27 crc kubenswrapper[4906]: I0221 00:08:27.912963 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:27Z","lastTransitionTime":"2026-02-21T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.016068 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.016179 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.016248 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.016276 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.016343 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:28Z","lastTransitionTime":"2026-02-21T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.119369 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.119439 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.119462 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.119490 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.119512 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:28Z","lastTransitionTime":"2026-02-21T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.222893 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.222936 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.222958 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.222984 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.223004 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:28Z","lastTransitionTime":"2026-02-21T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.326268 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.326338 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.326355 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.326417 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.326437 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:28Z","lastTransitionTime":"2026-02-21T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.429585 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.429674 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.429740 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.429767 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.429784 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:28Z","lastTransitionTime":"2026-02-21T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.483096 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 03:00:54.342853713 +0000 UTC Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.533233 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.533332 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.533351 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.533375 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.533392 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:28Z","lastTransitionTime":"2026-02-21T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.638087 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.638130 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.638147 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.638174 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.638193 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:28Z","lastTransitionTime":"2026-02-21T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.741174 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.741237 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.741260 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.741287 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.741307 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:28Z","lastTransitionTime":"2026-02-21T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.844775 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.844814 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.844824 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.844837 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.844846 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:28Z","lastTransitionTime":"2026-02-21T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.948269 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.948335 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.948352 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.948379 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:28 crc kubenswrapper[4906]: I0221 00:08:28.948397 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:28Z","lastTransitionTime":"2026-02-21T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.051427 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.051520 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.051539 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.051559 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.051574 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:29Z","lastTransitionTime":"2026-02-21T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.154205 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.154262 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.154271 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.154284 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.154293 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:29Z","lastTransitionTime":"2026-02-21T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.256781 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.256833 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.256848 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.256869 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.256885 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:29Z","lastTransitionTime":"2026-02-21T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.360417 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.360522 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.360547 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.360579 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.360604 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:29Z","lastTransitionTime":"2026-02-21T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.448507 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7544a92e-993a-46af-9f26-243f53d1206d-metrics-certs\") pod \"network-metrics-daemon-rhw7p\" (UID: \"7544a92e-993a-46af-9f26-243f53d1206d\") " pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:08:29 crc kubenswrapper[4906]: E0221 00:08:29.448744 4906 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 21 00:08:29 crc kubenswrapper[4906]: E0221 00:08:29.448864 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7544a92e-993a-46af-9f26-243f53d1206d-metrics-certs podName:7544a92e-993a-46af-9f26-243f53d1206d nodeName:}" failed. No retries permitted until 2026-02-21 00:08:37.44883072 +0000 UTC m=+52.700418276 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7544a92e-993a-46af-9f26-243f53d1206d-metrics-certs") pod "network-metrics-daemon-rhw7p" (UID: "7544a92e-993a-46af-9f26-243f53d1206d") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.463508 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.463574 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.463590 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.463614 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.463633 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:29Z","lastTransitionTime":"2026-02-21T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.483786 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 05:12:11.211931595 +0000 UTC Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.516983 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.517073 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.517011 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.517012 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:08:29 crc kubenswrapper[4906]: E0221 00:08:29.517168 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:08:29 crc kubenswrapper[4906]: E0221 00:08:29.517286 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhw7p" podUID="7544a92e-993a-46af-9f26-243f53d1206d" Feb 21 00:08:29 crc kubenswrapper[4906]: E0221 00:08:29.517439 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:08:29 crc kubenswrapper[4906]: E0221 00:08:29.517663 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.566970 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.567082 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.567101 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.567129 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.567146 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:29Z","lastTransitionTime":"2026-02-21T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.670334 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.670402 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.670420 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.670444 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.670464 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:29Z","lastTransitionTime":"2026-02-21T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.773595 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.773676 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.773733 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.773768 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.773792 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:29Z","lastTransitionTime":"2026-02-21T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.877592 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.877670 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.877726 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.877761 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.877783 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:29Z","lastTransitionTime":"2026-02-21T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.981575 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.981658 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.981682 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.981767 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:29 crc kubenswrapper[4906]: I0221 00:08:29.981790 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:29Z","lastTransitionTime":"2026-02-21T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:30 crc kubenswrapper[4906]: I0221 00:08:30.084486 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:30 crc kubenswrapper[4906]: I0221 00:08:30.084601 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:30 crc kubenswrapper[4906]: I0221 00:08:30.084625 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:30 crc kubenswrapper[4906]: I0221 00:08:30.084654 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:30 crc kubenswrapper[4906]: I0221 00:08:30.084674 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:30Z","lastTransitionTime":"2026-02-21T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:30 crc kubenswrapper[4906]: I0221 00:08:30.187620 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:30 crc kubenswrapper[4906]: I0221 00:08:30.187715 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:30 crc kubenswrapper[4906]: I0221 00:08:30.187736 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:30 crc kubenswrapper[4906]: I0221 00:08:30.187759 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:30 crc kubenswrapper[4906]: I0221 00:08:30.187775 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:30Z","lastTransitionTime":"2026-02-21T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:30 crc kubenswrapper[4906]: I0221 00:08:30.290715 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:30 crc kubenswrapper[4906]: I0221 00:08:30.290771 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:30 crc kubenswrapper[4906]: I0221 00:08:30.290794 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:30 crc kubenswrapper[4906]: I0221 00:08:30.290817 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:30 crc kubenswrapper[4906]: I0221 00:08:30.290834 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:30Z","lastTransitionTime":"2026-02-21T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:30 crc kubenswrapper[4906]: I0221 00:08:30.393608 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:30 crc kubenswrapper[4906]: I0221 00:08:30.393661 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:30 crc kubenswrapper[4906]: I0221 00:08:30.393677 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:30 crc kubenswrapper[4906]: I0221 00:08:30.393735 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:30 crc kubenswrapper[4906]: I0221 00:08:30.393758 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:30Z","lastTransitionTime":"2026-02-21T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:30 crc kubenswrapper[4906]: I0221 00:08:30.484258 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 22:04:40.266503654 +0000 UTC Feb 21 00:08:30 crc kubenswrapper[4906]: I0221 00:08:30.497345 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:30 crc kubenswrapper[4906]: I0221 00:08:30.497408 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:30 crc kubenswrapper[4906]: I0221 00:08:30.497429 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:30 crc kubenswrapper[4906]: I0221 00:08:30.497456 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:30 crc kubenswrapper[4906]: I0221 00:08:30.497481 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:30Z","lastTransitionTime":"2026-02-21T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:30 crc kubenswrapper[4906]: I0221 00:08:30.600596 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:30 crc kubenswrapper[4906]: I0221 00:08:30.600679 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:30 crc kubenswrapper[4906]: I0221 00:08:30.600739 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:30 crc kubenswrapper[4906]: I0221 00:08:30.600811 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:30 crc kubenswrapper[4906]: I0221 00:08:30.600829 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:30Z","lastTransitionTime":"2026-02-21T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:30 crc kubenswrapper[4906]: I0221 00:08:30.703539 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:30 crc kubenswrapper[4906]: I0221 00:08:30.703588 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:30 crc kubenswrapper[4906]: I0221 00:08:30.703601 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:30 crc kubenswrapper[4906]: I0221 00:08:30.703620 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:30 crc kubenswrapper[4906]: I0221 00:08:30.703632 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:30Z","lastTransitionTime":"2026-02-21T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:30 crc kubenswrapper[4906]: I0221 00:08:30.807113 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:30 crc kubenswrapper[4906]: I0221 00:08:30.807262 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:30 crc kubenswrapper[4906]: I0221 00:08:30.807292 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:30 crc kubenswrapper[4906]: I0221 00:08:30.807324 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:30 crc kubenswrapper[4906]: I0221 00:08:30.807348 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:30Z","lastTransitionTime":"2026-02-21T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:30 crc kubenswrapper[4906]: I0221 00:08:30.910785 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:30 crc kubenswrapper[4906]: I0221 00:08:30.910856 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:30 crc kubenswrapper[4906]: I0221 00:08:30.910879 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:30 crc kubenswrapper[4906]: I0221 00:08:30.910904 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:30 crc kubenswrapper[4906]: I0221 00:08:30.910921 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:30Z","lastTransitionTime":"2026-02-21T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.013785 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.013844 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.013862 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.013885 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.013902 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:31Z","lastTransitionTime":"2026-02-21T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.117036 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.117099 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.117116 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.117139 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.117157 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:31Z","lastTransitionTime":"2026-02-21T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.220267 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.220307 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.220318 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.220334 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.220343 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:31Z","lastTransitionTime":"2026-02-21T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.323852 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.323920 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.323937 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.323962 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.323980 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:31Z","lastTransitionTime":"2026-02-21T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.427521 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.427606 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.427639 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.427671 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.427727 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:31Z","lastTransitionTime":"2026-02-21T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.502333 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 17:58:33.231513609 +0000 UTC Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.516172 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.516298 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.516427 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.516646 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:08:31 crc kubenswrapper[4906]: E0221 00:08:31.516641 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:08:31 crc kubenswrapper[4906]: E0221 00:08:31.516919 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhw7p" podUID="7544a92e-993a-46af-9f26-243f53d1206d" Feb 21 00:08:31 crc kubenswrapper[4906]: E0221 00:08:31.517061 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:08:31 crc kubenswrapper[4906]: E0221 00:08:31.517203 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.532211 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.532339 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.532362 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.532391 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.532414 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:31Z","lastTransitionTime":"2026-02-21T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.635631 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.635710 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.635720 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.635741 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.635756 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:31Z","lastTransitionTime":"2026-02-21T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.738725 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.738787 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.738804 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.738827 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.738846 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:31Z","lastTransitionTime":"2026-02-21T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.841331 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.841382 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.841399 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.841422 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.841436 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:31Z","lastTransitionTime":"2026-02-21T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.944382 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.944431 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.944443 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.944466 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:31 crc kubenswrapper[4906]: I0221 00:08:31.944482 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:31Z","lastTransitionTime":"2026-02-21T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.047102 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.047164 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.047174 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.047201 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.047214 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:32Z","lastTransitionTime":"2026-02-21T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.084144 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.084199 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.084211 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.084235 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.084247 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:32Z","lastTransitionTime":"2026-02-21T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:32 crc kubenswrapper[4906]: E0221 00:08:32.100980 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4d8b0bdc-2182-48d0-bb15-cc57765305f9\\\",\\\"systemUUID\\\":\\\"94310220-1d46-4502-bb0a-b3628ff11479\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:32Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.106629 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.106749 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.106769 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.106798 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.106822 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:32Z","lastTransitionTime":"2026-02-21T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:32 crc kubenswrapper[4906]: E0221 00:08:32.128608 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4d8b0bdc-2182-48d0-bb15-cc57765305f9\\\",\\\"systemUUID\\\":\\\"94310220-1d46-4502-bb0a-b3628ff11479\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:32Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.132888 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.132933 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.132948 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.132998 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.133016 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:32Z","lastTransitionTime":"2026-02-21T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:32 crc kubenswrapper[4906]: E0221 00:08:32.156433 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4d8b0bdc-2182-48d0-bb15-cc57765305f9\\\",\\\"systemUUID\\\":\\\"94310220-1d46-4502-bb0a-b3628ff11479\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:32Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.160911 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.161029 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.161051 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.161079 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.161104 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:32Z","lastTransitionTime":"2026-02-21T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:32 crc kubenswrapper[4906]: E0221 00:08:32.181438 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4d8b0bdc-2182-48d0-bb15-cc57765305f9\\\",\\\"systemUUID\\\":\\\"94310220-1d46-4502-bb0a-b3628ff11479\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:32Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.186614 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.186653 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.186662 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.186678 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.186707 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:32Z","lastTransitionTime":"2026-02-21T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:32 crc kubenswrapper[4906]: E0221 00:08:32.202314 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4d8b0bdc-2182-48d0-bb15-cc57765305f9\\\",\\\"systemUUID\\\":\\\"94310220-1d46-4502-bb0a-b3628ff11479\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:32Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:32 crc kubenswrapper[4906]: E0221 00:08:32.202676 4906 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.204618 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.204648 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.204658 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.204671 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.204698 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:32Z","lastTransitionTime":"2026-02-21T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.313225 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.313297 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.313319 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.313351 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.313376 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:32Z","lastTransitionTime":"2026-02-21T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.416398 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.416459 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.416476 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.416502 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.416521 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:32Z","lastTransitionTime":"2026-02-21T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.502761 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 19:25:23.680024721 +0000 UTC
Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.517765 4906 scope.go:117] "RemoveContainer" containerID="d5f354772803c2489aacf92e608b3d9007e96963731d028ddc3fd972a1002122"
Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.519468 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.519533 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.519556 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.519587 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.519609 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:32Z","lastTransitionTime":"2026-02-21T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.622863 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.623144 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.623161 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.623184 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.623203 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:32Z","lastTransitionTime":"2026-02-21T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.725974 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.726021 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.726032 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.726050 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.726061 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:32Z","lastTransitionTime":"2026-02-21T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.828612 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.828711 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.828725 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.828742 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.828755 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:32Z","lastTransitionTime":"2026-02-21T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.931858 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.931936 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.931963 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.931995 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.932048 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:32Z","lastTransitionTime":"2026-02-21T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.942100 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmsd9_23efa997-378b-44cd-9f05-4a80559cd09b/ovnkube-controller/1.log" Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.945918 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" event={"ID":"23efa997-378b-44cd-9f05-4a80559cd09b","Type":"ContainerStarted","Data":"2ce8337bd884702fc27683799feff2cce24f9ad92151118070c084a04c0193b3"} Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.946536 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.965902 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17518505-fa81-4399-b6cd-5527dae35ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b5adc60
c3a9b0129022f0c54757cf20624c76e6757c4595aa5ff3f80d69479f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c753f098aae83a1b91b668b00166c9de9e5fc03f7a39708263241e934d83fb81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9qdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:32Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:32 crc kubenswrapper[4906]: I0221 00:08:32.979885 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x4dw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5dbf78b-7aff-48c9-9064-b47deb9527b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ce9ee6a3dd83591b7395bbbdd306eea6587bd93b0ce540ccf67b61f66077c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6w55s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x4dw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:32Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.000186 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8cf02d3-2a07-464d-b75f-8d3ad8374553\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d527700428fdbde47902ae05e7592fb0f3b43af16fa72386eb4a1ff274156a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104130e8d59df60560691070c803ca780c06ea5439f8d04c160d53df2042c41f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T00:08:05Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 00:07:59.169352 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 00:07:59.172022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-365547171/tls.crt::/tmp/serving-cert-365547171/tls.key\\\\\\\"\\\\nI0221 00:08:05.242429 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 00:08:05.251263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 00:08:05.251291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 00:08:05.251317 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 00:08:05.251325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 00:08:05.269495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0221 00:08:05.269788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269943 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 00:08:05.269971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 00:08:05.269996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 00:08:05.270022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0221 00:08:05.269815 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0221 00:08:05.276526 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f2
2cee82852bec2d1f9d010c485e5d1a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:32Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.014699 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ceaf11f6f4645b302c0c6371902385cc6f8d6bdaa01cedf578765b7d72ca44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:33Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.030935 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e655a3c2f3a6223fe12dded10c0c7b2e8b5024914ead29887a06efd4141b670e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://8c0b66a5fe8b81df7b0edfcef98acab5208abfce4cedc368707c6ce804b9b99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:33Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.034602 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.034648 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.034660 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.034677 4906 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.034704 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:33Z","lastTransitionTime":"2026-02-21T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.043321 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hbhzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24b0c67-fe8d-4e72-916a-d82306a8b82e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eec6020976a803c47d16f08a33b17b7b9654b2a9b8dd75a2c6f0a5cde4b2b963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",
\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5zp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hbhzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:33Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.058826 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23efa997-378b-44cd-9f05-4a80559cd09b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae4c734bcf6dc35c8437a49887777a17b228f7196bf2d77a8179923f121351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1574681de69105a2b0693609c82ef8c0cd554037fe2f4a2fd7345650bf4a006\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ead16930cc675e04c24dd469e0e7f7ada74a7ef3fe87e095e6702204361d89bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd0d9e0d6ac22632f0ca07c3ad926d99b71e9d1757e8ef0ee87a2008afbc671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e86b35bdd91ab61ef7469c956499be0dd990275293748f4fb8f43f2f42d84fc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8f4202ebd523d2b63c5625f20cb60477ce419a5b34d926dccab2661b2c107aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ce8337bd884702fc27683799feff2cce24f9ad92151118070c084a04c0193b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5f354772803c2489aacf92e608b3d9007e96963731d028ddc3fd972a1002122\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T00:08:18Z\\\",\\\"message\\\":\\\"}\\\\nI0221 00:08:18.749167 6317 services_controller.go:454] Service openshift-kube-scheduler/scheduler for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0221 00:08:18.749198 6317 ovnkube.go:137] failed to run ovnkube: [failed to start network 
controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:18Z is after 2025-08-24T17:21:41Z]\\\\nI0221 00:08:18.749168 6317 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator-machine-webhook]} 
name:Service_openshift-machine-api/mac\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"
/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef79e3c346342fa9d098b503a7dff636ce88cb88465cf6e802b9581980e46d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmsd9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:33Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.068356 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rhw7p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7544a92e-993a-46af-9f26-243f53d1206d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46f56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46f56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rhw7p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:33Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:33 crc 
kubenswrapper[4906]: I0221 00:08:33.079837 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:33Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.093766 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfzxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8dc1d37c07d67efbc80d224af9d36cba5c59cd0d51c72f41f7f01b1e5fed80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1957f
1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1957f1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:12Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfzxw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:33Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.104251 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3359de6c-dfba-4630-b39d-68e056b5d2ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7928eee765ac7aaa7118868638603d627a00b59850dac177c991754fc324122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d4b213ed7600da4237ff1c423d004e3afdaff5f599b453398266f6cdae16ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3218b20e5cf2be1651a9fbaef483c8cb2bd297449319f559727af1a47661c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f5568f3c4167bda4510df4f94df728f3286e7c8137bb0ac9f6af2c30c4992a\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:33Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.114556 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:33Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.125437 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:33Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.137482 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.137521 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.137532 4906 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.137549 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.137560 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:33Z","lastTransitionTime":"2026-02-21T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.138854 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f1d03697352fb3306d487c9b25ee0f95f98488fe929392eac603103299efd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:33Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.152395 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cqkxl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15db4e7-a13a-4bd9-8083-1ed09be64a82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4f38fff46919386afd729ea3f6437497e71eeeb557d7f2b955d0677b822a86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cndgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cqkxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:33Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.169881 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qqnnm" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e340b2-f60e-4535-b762-294c8685122e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69f35601a717fc132b2180c4d3f2ee38faf5fd0649a002278990d44c2d7be3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjg99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46b9920d
5411c5f15ee73b4448dee3a9d44d8999d7b842c8d50afc1b5c3fc89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjg99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qqnnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:33Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.186719 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d011e7-c2df-4b35-baf1-8b0404a8ae51\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686312fec8d89c23b2ef872e891fd0c1ba1279b0a64c834893c2d8431fca05bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bc1cd9f332e30d7fde3e05085076734d742d2bf375b3b5a55ba7a7e42e0a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbe8b3f30d9bff0951ca61694164e4ece35ca5d3562ad3571150b1cd13236c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255599132ce34bbd49e4298935a18d1294b2055069c606d796d58656a5552a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c798be0178552e289398fd4312d627282ba16f1cd39b2eb7d6e2ba6d4277323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:33Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.239326 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.239405 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.239427 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.239457 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.239482 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:33Z","lastTransitionTime":"2026-02-21T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.341510 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.341546 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.341554 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.341568 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.341576 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:33Z","lastTransitionTime":"2026-02-21T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.443562 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.443602 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.443615 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.443634 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.443647 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:33Z","lastTransitionTime":"2026-02-21T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.503112 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 09:34:34.59192507 +0000 UTC Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.516673 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.516779 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.516815 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:08:33 crc kubenswrapper[4906]: E0221 00:08:33.516909 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.516942 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:08:33 crc kubenswrapper[4906]: E0221 00:08:33.517079 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:08:33 crc kubenswrapper[4906]: E0221 00:08:33.517191 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:08:33 crc kubenswrapper[4906]: E0221 00:08:33.517324 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhw7p" podUID="7544a92e-993a-46af-9f26-243f53d1206d" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.546111 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.546150 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.546165 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.546185 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.546199 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:33Z","lastTransitionTime":"2026-02-21T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.649101 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.649146 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.649162 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.649186 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.649203 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:33Z","lastTransitionTime":"2026-02-21T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.752556 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.752613 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.752623 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.752641 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.752653 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:33Z","lastTransitionTime":"2026-02-21T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.855384 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.855465 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.855487 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.855518 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.855540 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:33Z","lastTransitionTime":"2026-02-21T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.952798 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmsd9_23efa997-378b-44cd-9f05-4a80559cd09b/ovnkube-controller/2.log" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.954057 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmsd9_23efa997-378b-44cd-9f05-4a80559cd09b/ovnkube-controller/1.log" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.957721 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.957793 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.957808 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.957833 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.957854 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:33Z","lastTransitionTime":"2026-02-21T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.958917 4906 generic.go:334] "Generic (PLEG): container finished" podID="23efa997-378b-44cd-9f05-4a80559cd09b" containerID="2ce8337bd884702fc27683799feff2cce24f9ad92151118070c084a04c0193b3" exitCode=1 Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.958974 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" event={"ID":"23efa997-378b-44cd-9f05-4a80559cd09b","Type":"ContainerDied","Data":"2ce8337bd884702fc27683799feff2cce24f9ad92151118070c084a04c0193b3"} Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.959039 4906 scope.go:117] "RemoveContainer" containerID="d5f354772803c2489aacf92e608b3d9007e96963731d028ddc3fd972a1002122" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.960517 4906 scope.go:117] "RemoveContainer" containerID="2ce8337bd884702fc27683799feff2cce24f9ad92151118070c084a04c0193b3" Feb 21 00:08:33 crc kubenswrapper[4906]: E0221 00:08:33.960930 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-bmsd9_openshift-ovn-kubernetes(23efa997-378b-44cd-9f05-4a80559cd09b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" Feb 21 00:08:33 crc kubenswrapper[4906]: I0221 00:08:33.977329 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3359de6c-dfba-4630-b39d-68e056b5d2ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7928eee765ac7aaa7118868638603d627a00b59850dac177c991754fc324122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d4b213ed7600da4237ff1c423d004e3afdaff5f599b453398266f6cdae16ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3218b20e5cf2be1651a9fbaef483c8cb2bd297449319f559727af1a47661c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f5568f3c4167bda4510df4f94df728f3286e7c8137bb0ac9f6af2c30c4992a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:33Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.001907 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:33Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.019823 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:34Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.037620 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f1d03697352fb3306d487c9b25ee0f95f98488fe929392eac603103299efd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T00:08:34Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.058955 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cqkxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15db4e7-a13a-4bd9-8083-1ed09be64a82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4f38fff46919386afd729ea3f6437497e71eeeb557d7f2b955d0677b822a86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cndgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cqkxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T00:08:34Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.060534 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.060560 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.060568 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.060583 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.060593 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:34Z","lastTransitionTime":"2026-02-21T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.078491 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qqnnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e340b2-f60e-4535-b762-294c8685122e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69f35601a717fc132b2180c4d3f2ee38faf5fd0649a002278990d44c2d7be3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjg99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46b9920d5411c5f15ee73b4448dee3a9d44d8999d7b842c8d50afc1b5c3fc89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjg99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qqnnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:34Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.114316 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d011e7-c2df-4b35-baf1-8b0404a8ae51\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686312fec8d89c23b2ef872e891fd0c1ba1279b0a64c834893c2d8431fca05bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bc1cd9f332e30d7fde3e05085076734d742d2bf375b3b5a55ba7a7e42e0a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbe8b3f30d9bff0951ca61694164e4ece35ca5d3562ad3571150b1cd13236c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255599132ce34bbd49e4298935a18d1294b2055069c606d796d58656a5552a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c798be0178552e289398fd4312d627282ba16f1cd39b2eb7d6e2ba6d4277323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:34Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.133931 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17518505-fa81-4399-b6cd-5527dae35ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b5adc60c3a9b0129022f0c54757cf20624c76e6757c4595aa5ff3f80d69479f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c753f098aae83a1b91b668b00166c9de9e5fc03f7a39708263241e934d83fb81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9qdv\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:34Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.151400 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x4dw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5dbf78b-7aff-48c9-9064-b47deb9527b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ce9ee6a3dd83591b7395bbbdd306eea6587bd93b0ce540ccf67b61f66077c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-02-21T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6w55s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x4dw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:34Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.163479 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.163527 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.163543 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.163569 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.163585 4906 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:34Z","lastTransitionTime":"2026-02-21T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.172496 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8cf02d3-2a07-464d-b75f-8d3ad8374553\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8
d527700428fdbde47902ae05e7592fb0f3b43af16fa72386eb4a1ff274156a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104130e8d59df60560691070c803ca780c06ea5439f8d04c160d53df2042c41f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 00:07:59.169352 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 00:07:59.172022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-365547171/tls.crt::/tmp/serving-cert-365547171/tls.key\\\\\\\"\\\\nI0221 00:08:05.242429 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 00:08:05.251263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 00:08:05.251291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 00:08:05.251317 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 00:08:05.251325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 00:08:05.269495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0221 00:08:05.269788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269909 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269943 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 00:08:05.269971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 00:08:05.269996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 00:08:05.270022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0221 00:08:05.269815 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0221 00:08:05.276526 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:34Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.191185 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ceaf11f6f4645b302c0c6371902385cc6f8d6bdaa01cedf578765b7d72ca44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:34Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.211820 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e655a3c2f3a6223fe12dded10c0c7b2e8b5024914ead29887a06efd4141b670e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://8c0b66a5fe8b81df7b0edfcef98acab5208abfce4cedc368707c6ce804b9b99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:34Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.226634 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hbhzd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24b0c67-fe8d-4e72-916a-d82306a8b82e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eec6020976a803c47d16f08a33b17b7b9654b2a9b8dd75a2c6f0a5cde4b2b963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5zp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hbhzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:34Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.260914 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23efa997-378b-44cd-9f05-4a80559cd09b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae4c734bcf6dc35c8437a49887777a17b228f7196bf2d77a8179923f121351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1574681de69105a2b0693609c82ef8c0cd554037fe2f4a2fd7345650bf4a006\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ead16930cc675e04c24dd469e0e7f7ada74a7ef3fe87e095e6702204361d89bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd0d9e0d6ac22632f0ca07c3ad926d99b71e9d1757e8ef0ee87a2008afbc671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e86b35bdd91ab61ef7469c956499be0dd990275293748f4fb8f43f2f42d84fc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8f4202ebd523d2b63c5625f20cb60477ce419a5b34d926dccab2661b2c107aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ce8337bd884702fc27683799feff2cce24f9ad92151118070c084a04c0193b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5f354772803c2489aacf92e608b3d9007e96963731d028ddc3fd972a1002122\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T00:08:18Z\\\",\\\"message\\\":\\\"}\\\\nI0221 00:08:18.749167 6317 services_controller.go:454] Service openshift-kube-scheduler/scheduler for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0221 00:08:18.749198 6317 ovnkube.go:137] failed to run ovnkube: [failed to start network 
controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:18Z is after 2025-08-24T17:21:41Z]\\\\nI0221 00:08:18.749168 6317 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator-machine-webhook]} name:Service_openshift-machine-api/mac\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ce8337bd884702fc27683799feff2cce24f9ad92151118070c084a04c0193b3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T00:08:33Z\\\",\\\"message\\\":\\\"work policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:33Z is after 2025-08-24T17:21:41Z]\\\\nI0221 00:08:33.409954 6529 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}},\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef79e3c346342fa9d098b503a7dff636ce88cb88465cf6e802b9581980e46d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"nod
e-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmsd9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:34Z is after 
2025-08-24T17:21:41Z" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.266521 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.266593 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.266614 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.266639 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.266658 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:34Z","lastTransitionTime":"2026-02-21T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.278167 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rhw7p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7544a92e-993a-46af-9f26-243f53d1206d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46f56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46f56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rhw7p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:34Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:34 crc 
kubenswrapper[4906]: I0221 00:08:34.295843 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:34Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.318374 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfzxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8dc1d37c07d67efbc80d224af9d36cba5c59cd0d51c72f41f7f01b1e5fed80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1957f
1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1957f1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:12Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfzxw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:34Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.368507 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.368549 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.368559 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.368575 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.368586 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:34Z","lastTransitionTime":"2026-02-21T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.470918 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.470965 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.470976 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.471002 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.471018 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:34Z","lastTransitionTime":"2026-02-21T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.504024 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 10:00:47.134021992 +0000 UTC Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.573986 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.574078 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.574097 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.574123 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.574140 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:34Z","lastTransitionTime":"2026-02-21T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.676870 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.677148 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.677172 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.677225 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.677251 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:34Z","lastTransitionTime":"2026-02-21T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.783977 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.784039 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.784054 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.784077 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.784095 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:34Z","lastTransitionTime":"2026-02-21T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.886946 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.887011 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.887029 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.887054 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.887073 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:34Z","lastTransitionTime":"2026-02-21T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.965200 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmsd9_23efa997-378b-44cd-9f05-4a80559cd09b/ovnkube-controller/2.log" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.970884 4906 scope.go:117] "RemoveContainer" containerID="2ce8337bd884702fc27683799feff2cce24f9ad92151118070c084a04c0193b3" Feb 21 00:08:34 crc kubenswrapper[4906]: E0221 00:08:34.971181 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-bmsd9_openshift-ovn-kubernetes(23efa997-378b-44cd-9f05-4a80559cd09b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.989966 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8cf02d3-2a07-464d-b75f-8d3ad8374553\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d527700428fdbde47902ae05e7592fb0f3b43af16fa72386eb4a1ff274156a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104130e8d59df60560691070c803ca780c06ea5439f8d04c160d53df2042c41f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T00:08:05Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 00:07:59.169352 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 00:07:59.172022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-365547171/tls.crt::/tmp/serving-cert-365547171/tls.key\\\\\\\"\\\\nI0221 00:08:05.242429 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 00:08:05.251263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 00:08:05.251291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 00:08:05.251317 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 00:08:05.251325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 00:08:05.269495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0221 00:08:05.269788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269943 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 00:08:05.269971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 00:08:05.269996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 00:08:05.270022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0221 00:08:05.269815 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0221 00:08:05.276526 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f2
2cee82852bec2d1f9d010c485e5d1a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:34Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.991511 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.991669 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.991800 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.991896 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:34 crc kubenswrapper[4906]: I0221 00:08:34.992018 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:34Z","lastTransitionTime":"2026-02-21T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.012503 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ceaf11f6f4645b302c0c6371902385cc6f8d6bdaa01cedf578765b7d72ca44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:35Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.030067 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17518505-fa81-4399-b6cd-5527dae35ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b5adc60c3a9b0129022f0c54757cf20624c76e6757c4595aa5ff3f80d69479f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube
-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c753f098aae83a1b91b668b00166c9de9e5fc03f7a39708263241e934d83fb81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9qdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-21T00:08:35Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.050205 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x4dw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5dbf78b-7aff-48c9-9064-b47deb9527b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ce9ee6a3dd83591b7395bbbdd306eea6587bd93b0ce540ccf67b61f66077c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6w55s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x4dw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:35Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.074650 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e655a3c2f3a6223fe12dded10c0c7b2e8b5024914ead29887a06efd4141b670e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c0b66a5fe8b81df7b0edfcef98acab5208abfce4cedc368707c6ce804b9b99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:35Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.094886 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:35Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.095387 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.095434 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.095451 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.095474 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.095491 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:35Z","lastTransitionTime":"2026-02-21T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.118790 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfzxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8dc1d37c07d67efbc80d224af9d36cba5c59cd0d51c72f41f7f01b1e5fed80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1957f
1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1957f1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:12Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfzxw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:35Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.136732 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hbhzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24b0c67-fe8d-4e72-916a-d82306a8b82e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eec6020976a803c47d16f08a33b17b7b9654b2a9b8dd75a2c6f0a5cde4b2b963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5zp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hbhzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:35Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.164572 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23efa997-378b-44cd-9f05-4a80559cd09b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae4c734bcf6dc35c8437a49887777a17b228f7196bf2d77a8179923f121351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1574681de69105a2b0693609c82ef8c0cd554037fe2f4a2fd7345650bf4a006\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ead16930cc675e04c24dd469e0e7f7ada74a7ef3fe87e095e6702204361d89bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd0d9e0d6ac22632f0ca07c3ad926d99b71e9d1757e8ef0ee87a2008afbc671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e86b35bdd91ab61ef7469c956499be0dd990275293748f4fb8f43f2f42d84fc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8f4202ebd523d2b63c5625f20cb60477ce419a5b34d926dccab2661b2c107aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ce8337bd884702fc27683799feff2cce24f9ad92151118070c084a04c0193b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ce8337bd884702fc27683799feff2cce24f9ad92151118070c084a04c0193b3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T00:08:33Z\\\",\\\"message\\\":\\\"work policy controller, err: could not add Event Handler for anpInformer during admin 
network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:33Z is after 2025-08-24T17:21:41Z]\\\\nI0221 00:08:33.409954 6529 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}},\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bmsd9_openshift-ovn-kubernetes(23efa997-378b-44cd-9f05-4a80559cd09b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef79e3c346342fa9d098b503a7dff636ce88cb88465cf6e802b9581980e46d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442b
ced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmsd9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:35Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.183082 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rhw7p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7544a92e-993a-46af-9f26-243f53d1206d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46f56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46f56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rhw7p\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:35Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.198499 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.198566 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.198578 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.198619 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.198635 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:35Z","lastTransitionTime":"2026-02-21T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.199913 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f1d03697352fb3306d487c9b25ee0f95f98488fe929392eac603103299efd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:35Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.214793 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cqkxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15db4e7-a13a-4bd9-8083-1ed09be64a82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4f38fff46919386afd729ea3f6437497e71eeeb557d7f2b955d0677b822a86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cndgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"
}}\" for pod \"openshift-multus\"/\"multus-cqkxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:35Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.230716 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qqnnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e340b2-f60e-4535-b762-294c8685122e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69f35601a717fc132b2180c4d3f2ee38faf5fd0649a002278990d44c2d7be3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjg99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46b9920d5411c5f15ee73b4448dee3a9d44d8999d7b842c8d50afc1b5c3fc89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjg99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qqnnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:35Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.263063 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d011e7-c2df-4b35-baf1-8b0404a8ae51\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686312fec8d89c23b2ef872e891fd0c1ba1279b0a64c834893c2d8431fca05bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bc1cd9f332e30d7fde3e05085076734d742d2bf375b3b5a55ba7a7e42e0a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbe8b3f30d9bff0951ca61694164e4ece35ca5d3562ad3571150b1cd13236c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255599132ce34bbd49e4298935a18d1294b2055069c606d796d58656a5552a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847
b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c798be0178552e289398fd4312d627282ba16f1cd39b2eb7d6e2ba6d4277323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e
9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"exitCode\\\":0,\\
\"finishedAt\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:35Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.283274 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3359de6c-dfba-4630-b39d-68e056b5d2ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7928eee765ac7aaa7118868638603d627a00b59850dac177c991754fc324122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d4b213ed7600da4237ff1c423d004e3afdaff5f599b453398266f6cdae16ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3218b20e5cf2be1651a9fbaef483c8cb2bd297449319f559727af1a47661c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f5568f3c4167bda4510df4f94df728f3286e7c8137bb0ac9f6af2c30c4992a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:35Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.301389 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.301432 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.301440 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.301456 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.301466 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:35Z","lastTransitionTime":"2026-02-21T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.304947 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:35Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.325448 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:35Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.404425 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.404475 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.404490 4906 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.404510 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.404525 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:35Z","lastTransitionTime":"2026-02-21T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.504871 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 21:11:30.088956289 +0000 UTC Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.506614 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.506729 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.506758 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.506792 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.506814 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:35Z","lastTransitionTime":"2026-02-21T00:08:35Z","reason":"KubeletNotReady","message":"container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.516227 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.516315 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.516326 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:08:35 crc kubenswrapper[4906]: E0221 00:08:35.516449 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.516506 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:08:35 crc kubenswrapper[4906]: E0221 00:08:35.516568 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:08:35 crc kubenswrapper[4906]: E0221 00:08:35.516730 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhw7p" podUID="7544a92e-993a-46af-9f26-243f53d1206d" Feb 21 00:08:35 crc kubenswrapper[4906]: E0221 00:08:35.517006 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.538497 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8cf02d3-2a07-464d-b75f-8d3ad8374553\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d527700428fdbde47902ae05e7592fb0f3b43af16fa72386eb4a1ff274156a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104130e8d59df60560691070c803ca780c06ea5439f8d04c160d53df2042c41f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T00:08:05Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 00:07:59.169352 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 00:07:59.172022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-365547171/tls.crt::/tmp/serving-cert-365547171/tls.key\\\\\\\"\\\\nI0221 00:08:05.242429 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 00:08:05.251263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 00:08:05.251291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 00:08:05.251317 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 00:08:05.251325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 00:08:05.269495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0221 00:08:05.269788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269943 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 00:08:05.269971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 00:08:05.269996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 00:08:05.270022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0221 00:08:05.269815 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0221 00:08:05.276526 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f2
2cee82852bec2d1f9d010c485e5d1a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:35Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.560007 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ceaf11f6f4645b302c0c6371902385cc6f8d6bdaa01cedf578765b7d72ca44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:35Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.576900 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17518505-fa81-4399-b6cd-5527dae35ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b5adc60c3a9b0129022f0c54757cf20624c76e6757c4595aa5ff3f80d69479f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c753f098aae83a1b91b668b00166c9de9e5fc03f7a39708263241e934d83fb81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9qdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:35Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.590989 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x4dw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5dbf78b-7aff-48c9-9064-b47deb9527b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ce9ee6a3dd83591b7395bbbdd306eea6587bd93b0ce540ccf67b61f66077c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6w55s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x4dw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:35Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.609350 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.609418 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.609432 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.609451 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.609464 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:35Z","lastTransitionTime":"2026-02-21T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.609822 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e655a3c2f3a6223fe12dded10c0c7b2e8b5024914ead29887a06efd4141b670e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c0b66a5fe8b81df7b0edfcef98acab5208abfce4cedc368707c6ce804b9b99a\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:35Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.628724 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:35Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.653797 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfzxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8dc1d37c07d67efbc80d224af9d36cba5c59cd0d51c72f41f7f01b1e5fed80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1957f
1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1957f1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:12Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfzxw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:35Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.667282 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hbhzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24b0c67-fe8d-4e72-916a-d82306a8b82e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eec6020976a803c47d16f08a33b17b7b9654b2a9b8dd75a2c6f0a5cde4b2b963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5zp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hbhzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:35Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.702749 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23efa997-378b-44cd-9f05-4a80559cd09b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae4c734bcf6dc35c8437a49887777a17b228f7196bf2d77a8179923f121351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1574681de69105a2b0693609c82ef8c0cd554037fe2f4a2fd7345650bf4a006\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ead16930cc675e04c24dd469e0e7f7ada74a7ef3fe87e095e6702204361d89bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd0d9e0d6ac22632f0ca07c3ad926d99b71e9d1757e8ef0ee87a2008afbc671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e86b35bdd91ab61ef7469c956499be0dd990275293748f4fb8f43f2f42d84fc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8f4202ebd523d2b63c5625f20cb60477ce419a5b34d926dccab2661b2c107aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ce8337bd884702fc27683799feff2cce24f9ad92151118070c084a04c0193b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ce8337bd884702fc27683799feff2cce24f9ad92151118070c084a04c0193b3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T00:08:33Z\\\",\\\"message\\\":\\\"work policy controller, err: could not add Event Handler for anpInformer during admin 
network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:33Z is after 2025-08-24T17:21:41Z]\\\\nI0221 00:08:33.409954 6529 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}},\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bmsd9_openshift-ovn-kubernetes(23efa997-378b-44cd-9f05-4a80559cd09b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef79e3c346342fa9d098b503a7dff636ce88cb88465cf6e802b9581980e46d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442b
ced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmsd9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:35Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.712972 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.713044 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.713071 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.713102 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.713125 4906 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:35Z","lastTransitionTime":"2026-02-21T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.720266 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rhw7p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7544a92e-993a-46af-9f26-243f53d1206d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46f56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46f56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rhw7p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:35Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:35 crc 
kubenswrapper[4906]: I0221 00:08:35.736080 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f1d03697352fb3306d487c9b25ee0f95f98488fe929392eac603103299efd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:35Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.755787 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cqkxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15db4e7-a13a-4bd9-8083-1ed09be64a82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4f38fff46919386afd729ea3f6437497e71eeeb557d7f2b955d0677b822a86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00
:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cndgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cqkxl\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:35Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.772373 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qqnnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e340b2-f60e-4535-b762-294c8685122e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69f35601a717fc132b2180c4d3f2ee38faf5fd0649a002278990d44c2d7be3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2026-02-21T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjg99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46b9920d5411c5f15ee73b4448dee3a9d44d8999d7b842c8d50afc1b5c3fc89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjg99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qqnnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-21T00:08:35Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.791867 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d011e7-c2df-4b35-baf1-8b0404a8ae51\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686312fec8d89c23b2ef872e891fd0c1ba1279b0a64c834893c2d8431fca05bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},
{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bc1cd9f332e30d7fde3e05085076734d742d2bf375b3b5a55ba7a7e42e0a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbe8b3f30d9bff0951ca61694164e4ece35ca5d3562ad3571150b1cd13236c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255599132ce34bbd49e4298935a18d1294b2055069c606d796d58656a5552a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c798be0178552e289398fd4312d627282ba16f1cd39b2eb7d6e2ba6d4277323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"star
tedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:35Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.808549 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3359de6c-dfba-4630-b39d-68e056b5d2ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7928eee765ac7aaa7118868638603d627a00b59850dac177c991754fc324122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d4b213ed7600da4237ff1c423d004e3afdaff5f599b453398266f6cdae16ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3218b20e5cf2be1651a9fbaef483c8cb2bd297449319f559727af1a47661c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f5568f3c4167bda4510df4f94df728f3286e7c8137bb0ac9f6af2c30c4992a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:35Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.815437 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.815531 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.815603 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.815664 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.815764 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:35Z","lastTransitionTime":"2026-02-21T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.822477 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:35Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.839290 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:35Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.919847 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.919927 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.919948 4906 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.919978 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:35 crc kubenswrapper[4906]: I0221 00:08:35.920000 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:35Z","lastTransitionTime":"2026-02-21T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.023608 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.024063 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.024239 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.024453 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.024650 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:36Z","lastTransitionTime":"2026-02-21T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.127660 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.127748 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.127769 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.127796 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.127819 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:36Z","lastTransitionTime":"2026-02-21T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.173846 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.185402 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.193627 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qqnnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e340b2-f60e-4535-b762-294c8685122e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69f35601a717fc132b2180c4d3f2ee38faf5fd0649a002278990d44c2d7be3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjg99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46b9920d5411c5f15ee73b4448dee3a9d44d8999d7b842c8d50afc1b5c3fc89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjg99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qqnnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:36Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.227844 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d011e7-c2df-4b35-baf1-8b0404a8ae51\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686312fec8d89c23b2ef872e891fd0c1ba1279b0a64c834893c2d8431fca05bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bc1cd9f332e30d7fde3e05085076734d742d2bf375b3b5a55ba7a7e42e0a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbe8b3f30d9bff0951ca61694164e4ece35ca5d3562ad3571150b1cd13236c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255599132ce34bbd49e4298935a18d1294b2055069c606d796d58656a5552a02\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c798be0178552e289398fd4312d627282ba16f1cd39b2eb7d6e2ba6d4277323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd7
2b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:36Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.231547 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.231626 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.231649 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.231676 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.231728 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:36Z","lastTransitionTime":"2026-02-21T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.254929 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3359de6c-dfba-4630-b39d-68e056b5d2ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7928eee765ac7aaa7118868638603d627a00b59850dac177c991754fc324122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"r
esource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d4b213ed7600da4237ff1c423d004e3afdaff5f599b453398266f6cdae16ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3218b20e5cf2be1651a9fbaef483c8cb2bd297449319f559727af1a47661c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f5568f3c4167bda4510df4f94df728f3286e7c8137bb0ac9f6af2c30c4992a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18
fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:36Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.269996 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:36Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.285111 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:36Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.303971 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f1d03697352fb3306d487c9b25ee0f95f98488fe929392eac603103299efd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T00:08:36Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.319585 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cqkxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15db4e7-a13a-4bd9-8083-1ed09be64a82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4f38fff46919386afd729ea3f6437497e71eeeb557d7f2b955d0677b822a86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cndgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cqkxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T00:08:36Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.333651 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8cf02d3-2a07-464d-b75f-8d3ad8374553\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\"
:\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d527700428fdbde47902ae05e7592fb0f3b43af16fa72386eb4a1ff274156a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiser
ver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104130e8d59df60560691070c803ca780c06ea5439f8d04c160d53df2042c41f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 00:07:59.169352 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 00:07:59.172022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-365547171/tls.crt::/tmp/serving-cert-365547171/tls.key\\\\\\\"\\\\nI0221 00:08:05.242429 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 00:08:05.251263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 00:08:05.251291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 00:08:05.251317 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 00:08:05.251325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 00:08:05.269495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0221 00:08:05.269788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269943 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 00:08:05.269971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 
00:08:05.269996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 00:08:05.270022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0221 00:08:05.269815 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0221 00:08:05.276526 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:36Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.335189 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.335219 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.335231 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.335247 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.335259 4906 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:36Z","lastTransitionTime":"2026-02-21T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.347608 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ceaf11f6f4645b302c0c6371902385cc6f8d6bdaa01cedf578765b7d72ca44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:36Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.361330 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17518505-fa81-4399-b6cd-5527dae35ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b5adc60c3a9b0129022f0c54757cf20624c76e6757c4595aa5ff3f80d69479f\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c753f098aae83a1b91b668b00166c9de9e5fc03f7a39708263241e934d83fb81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-b9qdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:36Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.373914 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x4dw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5dbf78b-7aff-48c9-9064-b47deb9527b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ce9ee6a3dd83591b7395bbbdd306eea6587bd93b0ce540ccf67b61f66077c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6w55s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x4dw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:36Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.390082 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e655a3c2f3a6223fe12dded10c0c7b2e8b5024914ead29887a06efd4141b670e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c0b66a5fe8b81df7b0edfcef98acab5208abfce4cedc368707c6ce804b9b99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:36Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.404239 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:36Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.418126 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfzxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8dc1d37c07d67efbc80d224af9d36cba5c59cd0d51c72f41f7f01b1e5fed80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1957f
1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1957f1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:12Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfzxw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:36Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.429067 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hbhzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24b0c67-fe8d-4e72-916a-d82306a8b82e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eec6020976a803c47d16f08a33b17b7b9654b2a9b8dd75a2c6f0a5cde4b2b963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5zp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hbhzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:36Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.437094 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.437136 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.437147 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.437160 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.437170 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:36Z","lastTransitionTime":"2026-02-21T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.448031 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23efa997-378b-44cd-9f05-4a80559cd09b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae4c734bcf6dc35c8437a49887777a17b228f7196bf2d77a8179923f121351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1574681de69105a2b0693609c82ef8c0cd554037fe2f4a2fd7345650bf4a006\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ead16930cc675e04c24dd469e0e7f7ada74a7ef3fe87e095e6702204361d89bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd0d9e0d6ac22632f0ca07c3ad926d99b71e9d1757e8ef0ee87a2008afbc671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e86b35bdd91ab61ef7469c956499be0dd990275293748f4fb8f43f2f42d84fc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8f4202ebd523d2b63c5625f20cb60477ce419a5b34d926dccab2661b2c107aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ce8337bd884702fc27683799feff2cce24f9ad92151118070c084a04c0193b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ce8337bd884702fc27683799feff2cce24f9ad92151118070c084a04c0193b3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T00:08:33Z\\\",\\\"message\\\":\\\"work policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc 
annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:33Z is after 2025-08-24T17:21:41Z]\\\\nI0221 00:08:33.409954 6529 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}},\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bmsd9_openshift-ovn-kubernetes(23efa997-378b-44cd-9f05-4a80559cd09b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef79e3c346342fa9d098b503a7dff636ce88cb88465cf6e802b9581980e46d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442b
ced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmsd9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:36Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.458596 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rhw7p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7544a92e-993a-46af-9f26-243f53d1206d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46f56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46f56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rhw7p\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:36Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.505777 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 13:50:13.666337537 +0000 UTC Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.540204 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.540268 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.540290 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.540319 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.540342 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:36Z","lastTransitionTime":"2026-02-21T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.643818 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.643921 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.643930 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.643949 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.643959 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:36Z","lastTransitionTime":"2026-02-21T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.747295 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.747373 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.747391 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.747416 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.747437 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:36Z","lastTransitionTime":"2026-02-21T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.849630 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.849743 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.849764 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.849788 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.849806 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:36Z","lastTransitionTime":"2026-02-21T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.952656 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.952737 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.952755 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.952776 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:36 crc kubenswrapper[4906]: I0221 00:08:36.952792 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:36Z","lastTransitionTime":"2026-02-21T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.056222 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.056293 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.056319 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.056351 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.056380 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:37Z","lastTransitionTime":"2026-02-21T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.159039 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.159104 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.159117 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.159132 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.159143 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:37Z","lastTransitionTime":"2026-02-21T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.262408 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.262670 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.262742 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.262762 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.262784 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:37Z","lastTransitionTime":"2026-02-21T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.366611 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.366728 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.366755 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.366787 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.366810 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:37Z","lastTransitionTime":"2026-02-21T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.443257 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:08:37 crc kubenswrapper[4906]: E0221 00:08:37.443450 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-21 00:09:09.443435894 +0000 UTC m=+84.695023400 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.469860 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.469889 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.469899 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.469912 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.469921 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:37Z","lastTransitionTime":"2026-02-21T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.506328 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 07:45:13.189001333 +0000 UTC Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.516859 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.516948 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.516962 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:08:37 crc kubenswrapper[4906]: E0221 00:08:37.517025 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:08:37 crc kubenswrapper[4906]: E0221 00:08:37.517226 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.517249 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:08:37 crc kubenswrapper[4906]: E0221 00:08:37.517349 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:08:37 crc kubenswrapper[4906]: E0221 00:08:37.517526 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhw7p" podUID="7544a92e-993a-46af-9f26-243f53d1206d" Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.544561 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.544632 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.544678 4906 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7544a92e-993a-46af-9f26-243f53d1206d-metrics-certs\") pod \"network-metrics-daemon-rhw7p\" (UID: \"7544a92e-993a-46af-9f26-243f53d1206d\") " pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.544762 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.544812 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:08:37 crc kubenswrapper[4906]: E0221 00:08:37.544927 4906 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 21 00:08:37 crc kubenswrapper[4906]: E0221 00:08:37.544948 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 21 00:08:37 crc kubenswrapper[4906]: E0221 00:08:37.544973 4906 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 21 00:08:37 crc kubenswrapper[4906]: E0221 00:08:37.545007 4906 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 21 00:08:37 crc kubenswrapper[4906]: E0221 00:08:37.545033 4906 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 00:08:37 crc kubenswrapper[4906]: E0221 00:08:37.544997 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-21 00:09:09.5449744 +0000 UTC m=+84.796561946 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 21 00:08:37 crc kubenswrapper[4906]: E0221 00:08:37.545069 4906 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 21 00:08:37 crc kubenswrapper[4906]: E0221 00:08:37.545112 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7544a92e-993a-46af-9f26-243f53d1206d-metrics-certs podName:7544a92e-993a-46af-9f26-243f53d1206d nodeName:}" failed. No retries permitted until 2026-02-21 00:08:53.545077843 +0000 UTC m=+68.796665449 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7544a92e-993a-46af-9f26-243f53d1206d-metrics-certs") pod "network-metrics-daemon-rhw7p" (UID: "7544a92e-993a-46af-9f26-243f53d1206d") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 21 00:08:37 crc kubenswrapper[4906]: E0221 00:08:37.545141 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-21 00:09:09.545124944 +0000 UTC m=+84.796712580 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 00:08:37 crc kubenswrapper[4906]: E0221 00:08:37.545183 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-21 00:09:09.545155425 +0000 UTC m=+84.796742971 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 21 00:08:37 crc kubenswrapper[4906]: E0221 00:08:37.545457 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 21 00:08:37 crc kubenswrapper[4906]: E0221 00:08:37.545486 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 21 00:08:37 crc kubenswrapper[4906]: E0221 00:08:37.545502 4906 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 00:08:37 crc kubenswrapper[4906]: E0221 00:08:37.545568 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-21 00:09:09.545551796 +0000 UTC m=+84.797139332 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.572828 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.573390 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.573445 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.573477 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.573498 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:37Z","lastTransitionTime":"2026-02-21T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.676815 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.676861 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.676870 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.676886 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.676899 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:37Z","lastTransitionTime":"2026-02-21T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.779986 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.780031 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.780043 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.780059 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.780072 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:37Z","lastTransitionTime":"2026-02-21T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.884110 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.884185 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.884203 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.884229 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.884248 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:37Z","lastTransitionTime":"2026-02-21T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.988273 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.988343 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.988367 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.988398 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:37 crc kubenswrapper[4906]: I0221 00:08:37.988421 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:37Z","lastTransitionTime":"2026-02-21T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:38 crc kubenswrapper[4906]: I0221 00:08:38.091607 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:38 crc kubenswrapper[4906]: I0221 00:08:38.091669 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:38 crc kubenswrapper[4906]: I0221 00:08:38.091727 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:38 crc kubenswrapper[4906]: I0221 00:08:38.091753 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:38 crc kubenswrapper[4906]: I0221 00:08:38.091772 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:38Z","lastTransitionTime":"2026-02-21T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:38 crc kubenswrapper[4906]: I0221 00:08:38.193932 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:38 crc kubenswrapper[4906]: I0221 00:08:38.194002 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:38 crc kubenswrapper[4906]: I0221 00:08:38.194020 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:38 crc kubenswrapper[4906]: I0221 00:08:38.194043 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:38 crc kubenswrapper[4906]: I0221 00:08:38.194061 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:38Z","lastTransitionTime":"2026-02-21T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:38 crc kubenswrapper[4906]: I0221 00:08:38.295732 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:38 crc kubenswrapper[4906]: I0221 00:08:38.295786 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:38 crc kubenswrapper[4906]: I0221 00:08:38.295794 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:38 crc kubenswrapper[4906]: I0221 00:08:38.295808 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:38 crc kubenswrapper[4906]: I0221 00:08:38.295818 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:38Z","lastTransitionTime":"2026-02-21T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:38 crc kubenswrapper[4906]: I0221 00:08:38.397918 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:38 crc kubenswrapper[4906]: I0221 00:08:38.397999 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:38 crc kubenswrapper[4906]: I0221 00:08:38.398039 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:38 crc kubenswrapper[4906]: I0221 00:08:38.398057 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:38 crc kubenswrapper[4906]: I0221 00:08:38.398068 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:38Z","lastTransitionTime":"2026-02-21T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:38 crc kubenswrapper[4906]: I0221 00:08:38.501138 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:38 crc kubenswrapper[4906]: I0221 00:08:38.501198 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:38 crc kubenswrapper[4906]: I0221 00:08:38.501242 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:38 crc kubenswrapper[4906]: I0221 00:08:38.501278 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:38 crc kubenswrapper[4906]: I0221 00:08:38.501302 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:38Z","lastTransitionTime":"2026-02-21T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:38 crc kubenswrapper[4906]: I0221 00:08:38.507389 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 14:09:49.157141088 +0000 UTC Feb 21 00:08:38 crc kubenswrapper[4906]: I0221 00:08:38.614205 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:38 crc kubenswrapper[4906]: I0221 00:08:38.614276 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:38 crc kubenswrapper[4906]: I0221 00:08:38.614301 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:38 crc kubenswrapper[4906]: I0221 00:08:38.614328 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:38 crc kubenswrapper[4906]: I0221 00:08:38.614347 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:38Z","lastTransitionTime":"2026-02-21T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:38 crc kubenswrapper[4906]: I0221 00:08:38.716835 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:38 crc kubenswrapper[4906]: I0221 00:08:38.716915 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:38 crc kubenswrapper[4906]: I0221 00:08:38.716939 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:38 crc kubenswrapper[4906]: I0221 00:08:38.716969 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:38 crc kubenswrapper[4906]: I0221 00:08:38.716989 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:38Z","lastTransitionTime":"2026-02-21T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:38 crc kubenswrapper[4906]: I0221 00:08:38.820482 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:38 crc kubenswrapper[4906]: I0221 00:08:38.820832 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:38 crc kubenswrapper[4906]: I0221 00:08:38.820853 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:38 crc kubenswrapper[4906]: I0221 00:08:38.820876 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:38 crc kubenswrapper[4906]: I0221 00:08:38.820893 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:38Z","lastTransitionTime":"2026-02-21T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:38 crc kubenswrapper[4906]: I0221 00:08:38.923637 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:38 crc kubenswrapper[4906]: I0221 00:08:38.923741 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:38 crc kubenswrapper[4906]: I0221 00:08:38.923768 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:38 crc kubenswrapper[4906]: I0221 00:08:38.923796 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:38 crc kubenswrapper[4906]: I0221 00:08:38.923817 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:38Z","lastTransitionTime":"2026-02-21T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.026453 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.026512 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.026525 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.026547 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.026561 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:39Z","lastTransitionTime":"2026-02-21T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.129542 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.129594 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.129609 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.129634 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.129652 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:39Z","lastTransitionTime":"2026-02-21T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.233469 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.233531 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.233675 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.233752 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.233776 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:39Z","lastTransitionTime":"2026-02-21T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.337656 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.337790 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.337816 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.337846 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.337872 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:39Z","lastTransitionTime":"2026-02-21T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.441223 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.441287 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.441305 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.441332 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.441351 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:39Z","lastTransitionTime":"2026-02-21T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.508072 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 19:25:05.283744896 +0000 UTC Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.516455 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.516610 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.516843 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:08:39 crc kubenswrapper[4906]: E0221 00:08:39.516823 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.516914 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:08:39 crc kubenswrapper[4906]: E0221 00:08:39.517014 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:08:39 crc kubenswrapper[4906]: E0221 00:08:39.517236 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rhw7p" podUID="7544a92e-993a-46af-9f26-243f53d1206d" Feb 21 00:08:39 crc kubenswrapper[4906]: E0221 00:08:39.517400 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.545166 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.545250 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.545277 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.545309 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.545339 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:39Z","lastTransitionTime":"2026-02-21T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.648405 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.648447 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.648461 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.648480 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.648494 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:39Z","lastTransitionTime":"2026-02-21T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.751617 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.751727 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.751766 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.751793 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.751813 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:39Z","lastTransitionTime":"2026-02-21T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.857515 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.857578 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.857589 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.857608 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.857619 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:39Z","lastTransitionTime":"2026-02-21T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.960174 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.960242 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.960276 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.960307 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:39 crc kubenswrapper[4906]: I0221 00:08:39.960329 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:39Z","lastTransitionTime":"2026-02-21T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.063541 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.063775 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.063858 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.063896 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.063969 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:40Z","lastTransitionTime":"2026-02-21T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.166998 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.167058 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.167075 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.167101 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.167120 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:40Z","lastTransitionTime":"2026-02-21T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.270645 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.270784 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.270810 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.270834 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.270852 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:40Z","lastTransitionTime":"2026-02-21T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.373988 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.374040 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.374052 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.374069 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.374082 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:40Z","lastTransitionTime":"2026-02-21T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.477709 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.477754 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.477766 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.477783 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.477794 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:40Z","lastTransitionTime":"2026-02-21T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.508641 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 16:00:58.025901563 +0000 UTC Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.580499 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.580579 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.580603 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.580633 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.580654 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:40Z","lastTransitionTime":"2026-02-21T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.683283 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.683351 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.683369 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.683393 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.683415 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:40Z","lastTransitionTime":"2026-02-21T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.792223 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.792272 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.792285 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.792304 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.792321 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:40Z","lastTransitionTime":"2026-02-21T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.895486 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.895534 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.895551 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.895572 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.895588 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:40Z","lastTransitionTime":"2026-02-21T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.997834 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.997897 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.997920 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.997951 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:40 crc kubenswrapper[4906]: I0221 00:08:40.997973 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:40Z","lastTransitionTime":"2026-02-21T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:41 crc kubenswrapper[4906]: I0221 00:08:41.101076 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:41 crc kubenswrapper[4906]: I0221 00:08:41.101135 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:41 crc kubenswrapper[4906]: I0221 00:08:41.101152 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:41 crc kubenswrapper[4906]: I0221 00:08:41.101177 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:41 crc kubenswrapper[4906]: I0221 00:08:41.101200 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:41Z","lastTransitionTime":"2026-02-21T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:41 crc kubenswrapper[4906]: I0221 00:08:41.210147 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:41 crc kubenswrapper[4906]: I0221 00:08:41.210245 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:41 crc kubenswrapper[4906]: I0221 00:08:41.210264 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:41 crc kubenswrapper[4906]: I0221 00:08:41.210319 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:41 crc kubenswrapper[4906]: I0221 00:08:41.210338 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:41Z","lastTransitionTime":"2026-02-21T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:41 crc kubenswrapper[4906]: I0221 00:08:41.313436 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:41 crc kubenswrapper[4906]: I0221 00:08:41.313553 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:41 crc kubenswrapper[4906]: I0221 00:08:41.313571 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:41 crc kubenswrapper[4906]: I0221 00:08:41.313597 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:41 crc kubenswrapper[4906]: I0221 00:08:41.313619 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:41Z","lastTransitionTime":"2026-02-21T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:41 crc kubenswrapper[4906]: I0221 00:08:41.417315 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:41 crc kubenswrapper[4906]: I0221 00:08:41.417375 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:41 crc kubenswrapper[4906]: I0221 00:08:41.417412 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:41 crc kubenswrapper[4906]: I0221 00:08:41.417453 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:41 crc kubenswrapper[4906]: I0221 00:08:41.417476 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:41Z","lastTransitionTime":"2026-02-21T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:41 crc kubenswrapper[4906]: I0221 00:08:41.509457 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 08:27:59.926698277 +0000 UTC Feb 21 00:08:41 crc kubenswrapper[4906]: I0221 00:08:41.516937 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:08:41 crc kubenswrapper[4906]: I0221 00:08:41.517016 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:08:41 crc kubenswrapper[4906]: I0221 00:08:41.517037 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:08:41 crc kubenswrapper[4906]: E0221 00:08:41.517123 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:08:41 crc kubenswrapper[4906]: I0221 00:08:41.517188 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:08:41 crc kubenswrapper[4906]: E0221 00:08:41.517405 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:08:41 crc kubenswrapper[4906]: E0221 00:08:41.517454 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rhw7p" podUID="7544a92e-993a-46af-9f26-243f53d1206d" Feb 21 00:08:41 crc kubenswrapper[4906]: E0221 00:08:41.517553 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:08:41 crc kubenswrapper[4906]: I0221 00:08:41.520328 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:41 crc kubenswrapper[4906]: I0221 00:08:41.520383 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:41 crc kubenswrapper[4906]: I0221 00:08:41.520399 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:41 crc kubenswrapper[4906]: I0221 00:08:41.520430 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:41 crc kubenswrapper[4906]: I0221 00:08:41.520446 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:41Z","lastTransitionTime":"2026-02-21T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:41 crc kubenswrapper[4906]: I0221 00:08:41.623636 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:41 crc kubenswrapper[4906]: I0221 00:08:41.623739 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:41 crc kubenswrapper[4906]: I0221 00:08:41.623767 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:41 crc kubenswrapper[4906]: I0221 00:08:41.623798 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:41 crc kubenswrapper[4906]: I0221 00:08:41.623822 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:41Z","lastTransitionTime":"2026-02-21T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:41 crc kubenswrapper[4906]: I0221 00:08:41.726773 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:41 crc kubenswrapper[4906]: I0221 00:08:41.726839 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:41 crc kubenswrapper[4906]: I0221 00:08:41.726858 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:41 crc kubenswrapper[4906]: I0221 00:08:41.726883 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:41 crc kubenswrapper[4906]: I0221 00:08:41.726899 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:41Z","lastTransitionTime":"2026-02-21T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:41 crc kubenswrapper[4906]: I0221 00:08:41.829568 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:41 crc kubenswrapper[4906]: I0221 00:08:41.829610 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:41 crc kubenswrapper[4906]: I0221 00:08:41.829619 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:41 crc kubenswrapper[4906]: I0221 00:08:41.829634 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:41 crc kubenswrapper[4906]: I0221 00:08:41.829645 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:41Z","lastTransitionTime":"2026-02-21T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:41 crc kubenswrapper[4906]: I0221 00:08:41.932836 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:41 crc kubenswrapper[4906]: I0221 00:08:41.932927 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:41 crc kubenswrapper[4906]: I0221 00:08:41.932948 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:41 crc kubenswrapper[4906]: I0221 00:08:41.932974 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:41 crc kubenswrapper[4906]: I0221 00:08:41.932993 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:41Z","lastTransitionTime":"2026-02-21T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.036041 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.036111 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.036135 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.036165 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.036187 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:42Z","lastTransitionTime":"2026-02-21T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.138944 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.138979 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.138989 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.139005 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.139016 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:42Z","lastTransitionTime":"2026-02-21T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.241446 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.241489 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.241502 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.241517 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.241528 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:42Z","lastTransitionTime":"2026-02-21T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.344709 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.344770 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.344788 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.344817 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.344839 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:42Z","lastTransitionTime":"2026-02-21T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.379302 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.379353 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.379366 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.379385 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.379396 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:42Z","lastTransitionTime":"2026-02-21T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:42 crc kubenswrapper[4906]: E0221 00:08:42.398134 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4d8b0bdc-2182-48d0-bb15-cc57765305f9\\\",\\\"systemUUID\\\":\\\"94310220-1d46-4502-bb0a-b3628ff11479\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:42Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.403487 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.403539 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.403558 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.403592 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.403616 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:42Z","lastTransitionTime":"2026-02-21T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:42 crc kubenswrapper[4906]: E0221 00:08:42.424872 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4d8b0bdc-2182-48d0-bb15-cc57765305f9\\\",\\\"systemUUID\\\":\\\"94310220-1d46-4502-bb0a-b3628ff11479\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:42Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.430928 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.430975 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.430990 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.431014 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.431035 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:42Z","lastTransitionTime":"2026-02-21T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:42 crc kubenswrapper[4906]: E0221 00:08:42.451524 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4d8b0bdc-2182-48d0-bb15-cc57765305f9\\\",\\\"systemUUID\\\":\\\"94310220-1d46-4502-bb0a-b3628ff11479\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:42Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.455962 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.455997 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.456009 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.456024 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.456037 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:42Z","lastTransitionTime":"2026-02-21T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:42 crc kubenswrapper[4906]: E0221 00:08:42.469557 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4d8b0bdc-2182-48d0-bb15-cc57765305f9\\\",\\\"systemUUID\\\":\\\"94310220-1d46-4502-bb0a-b3628ff11479\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:42Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.474331 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.474416 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.474441 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.474473 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.474507 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:42Z","lastTransitionTime":"2026-02-21T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:42 crc kubenswrapper[4906]: E0221 00:08:42.492735 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4d8b0bdc-2182-48d0-bb15-cc57765305f9\\\",\\\"systemUUID\\\":\\\"94310220-1d46-4502-bb0a-b3628ff11479\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:42Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:42 crc kubenswrapper[4906]: E0221 00:08:42.492838 4906 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.494455 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.494512 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.494523 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.494534 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.494542 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:42Z","lastTransitionTime":"2026-02-21T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.510315 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 06:46:29.491832287 +0000 UTC Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.597042 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.597089 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.597102 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.597119 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.597131 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:42Z","lastTransitionTime":"2026-02-21T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.700278 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.700339 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.700361 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.700390 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.700414 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:42Z","lastTransitionTime":"2026-02-21T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.803001 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.803096 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.803114 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.803136 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.803152 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:42Z","lastTransitionTime":"2026-02-21T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.905767 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.905835 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.905851 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.905876 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:42 crc kubenswrapper[4906]: I0221 00:08:42.905895 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:42Z","lastTransitionTime":"2026-02-21T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.008656 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.008711 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.008724 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.008739 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.008750 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:43Z","lastTransitionTime":"2026-02-21T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.111408 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.111452 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.111463 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.111480 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.111493 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:43Z","lastTransitionTime":"2026-02-21T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.214462 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.214503 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.214514 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.214529 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.214539 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:43Z","lastTransitionTime":"2026-02-21T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.316857 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.316978 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.317050 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.317083 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.317101 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:43Z","lastTransitionTime":"2026-02-21T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.453959 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.454010 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.454027 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.454051 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.454066 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:43Z","lastTransitionTime":"2026-02-21T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.511146 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 22:39:17.241834177 +0000 UTC Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.516821 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.516897 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:08:43 crc kubenswrapper[4906]: E0221 00:08:43.516981 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.516904 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:08:43 crc kubenswrapper[4906]: E0221 00:08:43.517116 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.517215 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:08:43 crc kubenswrapper[4906]: E0221 00:08:43.517358 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:08:43 crc kubenswrapper[4906]: E0221 00:08:43.517477 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhw7p" podUID="7544a92e-993a-46af-9f26-243f53d1206d" Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.556613 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.556682 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.556730 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.556754 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.556773 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:43Z","lastTransitionTime":"2026-02-21T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.659745 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.659809 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.659831 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.659861 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.659883 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:43Z","lastTransitionTime":"2026-02-21T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.763788 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.763840 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.763860 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.763887 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.763909 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:43Z","lastTransitionTime":"2026-02-21T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.867847 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.867928 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.867952 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.867976 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.867996 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:43Z","lastTransitionTime":"2026-02-21T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.971426 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.971477 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.971498 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.971526 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:43 crc kubenswrapper[4906]: I0221 00:08:43.971549 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:43Z","lastTransitionTime":"2026-02-21T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:44 crc kubenswrapper[4906]: I0221 00:08:44.074992 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:44 crc kubenswrapper[4906]: I0221 00:08:44.075058 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:44 crc kubenswrapper[4906]: I0221 00:08:44.075074 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:44 crc kubenswrapper[4906]: I0221 00:08:44.075091 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:44 crc kubenswrapper[4906]: I0221 00:08:44.075104 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:44Z","lastTransitionTime":"2026-02-21T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:44 crc kubenswrapper[4906]: I0221 00:08:44.177721 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:44 crc kubenswrapper[4906]: I0221 00:08:44.177801 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:44 crc kubenswrapper[4906]: I0221 00:08:44.177835 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:44 crc kubenswrapper[4906]: I0221 00:08:44.177864 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:44 crc kubenswrapper[4906]: I0221 00:08:44.177884 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:44Z","lastTransitionTime":"2026-02-21T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:44 crc kubenswrapper[4906]: I0221 00:08:44.280973 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:44 crc kubenswrapper[4906]: I0221 00:08:44.281026 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:44 crc kubenswrapper[4906]: I0221 00:08:44.281039 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:44 crc kubenswrapper[4906]: I0221 00:08:44.281058 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:44 crc kubenswrapper[4906]: I0221 00:08:44.281071 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:44Z","lastTransitionTime":"2026-02-21T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:44 crc kubenswrapper[4906]: I0221 00:08:44.384373 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:44 crc kubenswrapper[4906]: I0221 00:08:44.384412 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:44 crc kubenswrapper[4906]: I0221 00:08:44.384420 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:44 crc kubenswrapper[4906]: I0221 00:08:44.384434 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:44 crc kubenswrapper[4906]: I0221 00:08:44.384443 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:44Z","lastTransitionTime":"2026-02-21T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:44 crc kubenswrapper[4906]: I0221 00:08:44.488094 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:44 crc kubenswrapper[4906]: I0221 00:08:44.488172 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:44 crc kubenswrapper[4906]: I0221 00:08:44.488208 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:44 crc kubenswrapper[4906]: I0221 00:08:44.488239 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:44 crc kubenswrapper[4906]: I0221 00:08:44.488261 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:44Z","lastTransitionTime":"2026-02-21T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:44 crc kubenswrapper[4906]: I0221 00:08:44.511522 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 15:35:41.710489335 +0000 UTC Feb 21 00:08:44 crc kubenswrapper[4906]: I0221 00:08:44.590971 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:44 crc kubenswrapper[4906]: I0221 00:08:44.591046 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:44 crc kubenswrapper[4906]: I0221 00:08:44.591070 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:44 crc kubenswrapper[4906]: I0221 00:08:44.591138 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:44 crc kubenswrapper[4906]: I0221 00:08:44.591162 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:44Z","lastTransitionTime":"2026-02-21T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:44 crc kubenswrapper[4906]: I0221 00:08:44.693795 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:44 crc kubenswrapper[4906]: I0221 00:08:44.693853 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:44 crc kubenswrapper[4906]: I0221 00:08:44.693875 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:44 crc kubenswrapper[4906]: I0221 00:08:44.693903 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:44 crc kubenswrapper[4906]: I0221 00:08:44.693927 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:44Z","lastTransitionTime":"2026-02-21T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:44 crc kubenswrapper[4906]: I0221 00:08:44.796486 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:44 crc kubenswrapper[4906]: I0221 00:08:44.796545 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:44 crc kubenswrapper[4906]: I0221 00:08:44.796562 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:44 crc kubenswrapper[4906]: I0221 00:08:44.796587 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:44 crc kubenswrapper[4906]: I0221 00:08:44.796604 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:44Z","lastTransitionTime":"2026-02-21T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:44 crc kubenswrapper[4906]: I0221 00:08:44.899399 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:44 crc kubenswrapper[4906]: I0221 00:08:44.899491 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:44 crc kubenswrapper[4906]: I0221 00:08:44.899512 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:44 crc kubenswrapper[4906]: I0221 00:08:44.899534 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:44 crc kubenswrapper[4906]: I0221 00:08:44.899549 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:44Z","lastTransitionTime":"2026-02-21T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.002409 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.002467 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.002490 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.002518 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.002540 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:45Z","lastTransitionTime":"2026-02-21T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.104526 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.104577 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.104593 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.104614 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.104630 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:45Z","lastTransitionTime":"2026-02-21T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.207421 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.207484 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.207503 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.207530 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.207546 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:45Z","lastTransitionTime":"2026-02-21T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.310627 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.310720 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.310745 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.310809 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.310826 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:45Z","lastTransitionTime":"2026-02-21T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.414647 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.414773 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.414793 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.414820 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.414842 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:45Z","lastTransitionTime":"2026-02-21T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.512254 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 11:18:25.008553696 +0000 UTC Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.516230 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.516369 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:08:45 crc kubenswrapper[4906]: E0221 00:08:45.516557 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhw7p" podUID="7544a92e-993a-46af-9f26-243f53d1206d" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.516598 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.516603 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:08:45 crc kubenswrapper[4906]: E0221 00:08:45.516821 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:08:45 crc kubenswrapper[4906]: E0221 00:08:45.516980 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:08:45 crc kubenswrapper[4906]: E0221 00:08:45.517146 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.517713 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.517743 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.517757 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.517815 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.517830 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:45Z","lastTransitionTime":"2026-02-21T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.536647 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bf31c61-03f5-48a6-9bc2-3071d6097c5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d2240fcc84fe2ddd0f38e9418c7ad17b0a19fd0b283a2833790a7ccfeda9d7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ef80849abf92993417e85ac3dec58
95f800ebf7f3d27bfc052b9bcbefde9115\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5930ffb00a4f30eb0905205eee3f562b0b57a27d60f960d1e11a60f16da8aa21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0451d4a6a1426a64e5717a5c6ebb6357d13778bc225c6d9f1aff545928bd572f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0451d4a6a1426a64e5717a5c6ebb6357d13778bc225c6d9f1aff545928bd572f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:45Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.736159 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e655a3c2f3a6223fe12dded10c0c7b2e8b5024914ead29887a06efd4141b670e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c0b66a5fe8b81df7b0edfcef98acab5208abfce4cedc368707c6ce804b9b99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:45Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.740735 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.740760 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.740788 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.740802 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.740811 4906 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:45Z","lastTransitionTime":"2026-02-21T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.771995 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23efa997-378b-44cd-9f05-4a80559cd09b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae4c734bcf6dc35c8437a49887777a17b228f7196bf2d77a8179923f121351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1574681de69105a2b0693609c82ef8c0cd554037fe2f4a2fd7345650bf4a006\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ead16930cc675e04c24dd469e0e7f7ada74a7ef3fe87e095e6702204361d89bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd0d9e0d6ac22632f0ca07c3ad926d99b71e9d1757e8ef0ee87a2008afbc671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e86b35bdd91ab61ef7469c956499be0dd990275293748f4fb8f43f2f42d84fc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8f4202ebd523d2b63c5625f20cb60477ce419a5b34d926dccab2661b2c107aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ce8337bd884702fc27683799feff2cce24f9ad92151118070c084a04c0193b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ce8337bd884702fc27683799feff2cce24f9ad92151118070c084a04c0193b3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T00:08:33Z\\\",\\\"message\\\":\\\"work policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc 
annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:33Z is after 2025-08-24T17:21:41Z]\\\\nI0221 00:08:33.409954 6529 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}},\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bmsd9_openshift-ovn-kubernetes(23efa997-378b-44cd-9f05-4a80559cd09b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef79e3c346342fa9d098b503a7dff636ce88cb88465cf6e802b9581980e46d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442b
ced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmsd9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:45Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.786464 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rhw7p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7544a92e-993a-46af-9f26-243f53d1206d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46f56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46f56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rhw7p\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:45Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.806138 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:45Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.847026 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.847112 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.847137 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.847169 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.847191 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:45Z","lastTransitionTime":"2026-02-21T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.847110 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfzxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8dc1d37c07d67efbc80d224af9d36cba5c59cd0d51c72f41f7f01b1e5fed80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1957f1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1957f1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfzxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:45Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.884472 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hbhzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24b0c67-fe8d-4e72-916a-d82306a8b82e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eec6020976a803c47d16f08a33b17b7b9654b2a9b8dd75a2c6f0a5cde4b2b963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b198
88cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5zp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hbhzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:45Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.905079 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:45Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.920190 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:45Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.931893 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f1d03697352fb3306d487c9b25ee0f95f98488fe929392eac603103299efd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T00:08:45Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.943896 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cqkxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15db4e7-a13a-4bd9-8083-1ed09be64a82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4f38fff46919386afd729ea3f6437497e71eeeb557d7f2b955d0677b822a86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cndgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cqkxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T00:08:45Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.950082 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.950148 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.950160 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.950178 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.950189 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:45Z","lastTransitionTime":"2026-02-21T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.954313 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qqnnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e340b2-f60e-4535-b762-294c8685122e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69f35601a717fc132b2180c4d3f2ee38faf5fd0649a002278990d44c2d7be3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjg99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46b9920d5411c5f15ee73b4448dee3a9d44d8999d7b842c8d50afc1b5c3fc89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjg99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qqnnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:45Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.971756 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d011e7-c2df-4b35-baf1-8b0404a8ae51\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686312fec8d89c23b2ef872e891fd0c1ba1279b0a64c834893c2d8431fca05bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bc1cd9f332e30d7fde3e05085076734d742d2bf375b3b5a55ba7a7e42e0a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbe8b3f30d9bff0951ca61694164e4ece35ca5d3562ad3571150b1cd13236c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255599132ce34bbd49e4298935a18d1294b2055069c606d796d58656a5552a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c798be0178552e289398fd4312d627282ba16f1cd39b2eb7d6e2ba6d4277323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:45Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.982824 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3359de6c-dfba-4630-b39d-68e056b5d2ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7928eee765ac7aaa7118868638603d627a00b59850dac177c991754fc324122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d4b213ed7600da4237ff1c423d004e3afdaff5f599b453398266f6cdae16ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3218b20e5cf2be1651a9fbaef483c8cb2bd297449319f559727af1a47661c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:
47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f5568f3c4167bda4510df4f94df728f3286e7c8137bb0ac9f6af2c30c4992a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:45Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:45 crc kubenswrapper[4906]: I0221 00:08:45.991880 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x4dw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5dbf78b-7aff-48c9-9064-b47deb9527b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ce9ee6a3dd83591b7395bbbdd306eea6587bd93b0ce540ccf67b61f66077c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6w55s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x4dw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:45Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:46 crc kubenswrapper[4906]: I0221 00:08:46.008666 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8cf02d3-2a07-464d-b75f-8d3ad8374553\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d527700428fdbde47902ae05e7592fb0f3b43af16fa72386eb4a1ff274156a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104130e8d59df60560691070c803ca780c06ea5439f8d04c160d53df2042c41f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 00:07:59.169352 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 00:07:59.172022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-365547171/tls.crt::/tmp/serving-cert-365547171/tls.key\\\\\\\"\\\\nI0221 00:08:05.242429 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 00:08:05.251263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 00:08:05.251291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 00:08:05.251317 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 00:08:05.251325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 00:08:05.269495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0221 00:08:05.269788 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269943 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 00:08:05.269971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 00:08:05.269996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 00:08:05.270022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0221 00:08:05.269815 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0221 00:08:05.276526 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T0
0:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:46Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:46 crc kubenswrapper[4906]: I0221 00:08:46.034570 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ceaf11f6f4645b302c0c6371902385cc6f8d6bdaa01cedf578765b7d72ca44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:46Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:46 crc kubenswrapper[4906]: I0221 00:08:46.051021 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17518505-fa81-4399-b6cd-5527dae35ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b5adc60c3a9b0129022f0c54757cf20624c76e6757c4595aa5ff3f80d69479f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c753f098aae83a1b91b668b00166c9de9e5fc03f7a39708263241e934d83fb81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9qdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:46Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:46 crc kubenswrapper[4906]: I0221 00:08:46.052580 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:46 crc 
kubenswrapper[4906]: I0221 00:08:46.052614 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:46 crc kubenswrapper[4906]: I0221 00:08:46.052625 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:46 crc kubenswrapper[4906]: I0221 00:08:46.052640 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:46 crc kubenswrapper[4906]: I0221 00:08:46.052650 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:46Z","lastTransitionTime":"2026-02-21T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:46 crc kubenswrapper[4906]: I0221 00:08:46.154301 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:46 crc kubenswrapper[4906]: I0221 00:08:46.154333 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:46 crc kubenswrapper[4906]: I0221 00:08:46.154386 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:46 crc kubenswrapper[4906]: I0221 00:08:46.154403 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:46 crc kubenswrapper[4906]: I0221 00:08:46.154415 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:46Z","lastTransitionTime":"2026-02-21T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:46 crc kubenswrapper[4906]: I0221 00:08:46.257336 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:46 crc kubenswrapper[4906]: I0221 00:08:46.257679 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:46 crc kubenswrapper[4906]: I0221 00:08:46.257854 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:46 crc kubenswrapper[4906]: I0221 00:08:46.257998 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:46 crc kubenswrapper[4906]: I0221 00:08:46.258148 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:46Z","lastTransitionTime":"2026-02-21T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:46 crc kubenswrapper[4906]: I0221 00:08:46.360736 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:46 crc kubenswrapper[4906]: I0221 00:08:46.360777 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:46 crc kubenswrapper[4906]: I0221 00:08:46.360789 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:46 crc kubenswrapper[4906]: I0221 00:08:46.360804 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:46 crc kubenswrapper[4906]: I0221 00:08:46.360813 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:46Z","lastTransitionTime":"2026-02-21T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:46 crc kubenswrapper[4906]: I0221 00:08:46.464180 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:46 crc kubenswrapper[4906]: I0221 00:08:46.464238 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:46 crc kubenswrapper[4906]: I0221 00:08:46.464256 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:46 crc kubenswrapper[4906]: I0221 00:08:46.464293 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:46 crc kubenswrapper[4906]: I0221 00:08:46.464311 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:46Z","lastTransitionTime":"2026-02-21T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:46 crc kubenswrapper[4906]: I0221 00:08:46.513461 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 01:15:52.187620729 +0000 UTC Feb 21 00:08:46 crc kubenswrapper[4906]: I0221 00:08:46.518024 4906 scope.go:117] "RemoveContainer" containerID="2ce8337bd884702fc27683799feff2cce24f9ad92151118070c084a04c0193b3" Feb 21 00:08:46 crc kubenswrapper[4906]: E0221 00:08:46.518429 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-bmsd9_openshift-ovn-kubernetes(23efa997-378b-44cd-9f05-4a80559cd09b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" Feb 21 00:08:46 crc kubenswrapper[4906]: I0221 00:08:46.589545 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:46 crc kubenswrapper[4906]: I0221 00:08:46.589622 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:46 crc kubenswrapper[4906]: I0221 00:08:46.589645 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:46 crc kubenswrapper[4906]: I0221 00:08:46.589677 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:46 crc kubenswrapper[4906]: I0221 00:08:46.589739 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:46Z","lastTransitionTime":"2026-02-21T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:46 crc kubenswrapper[4906]: I0221 00:08:46.692682 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:46 crc kubenswrapper[4906]: I0221 00:08:46.692780 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:46 crc kubenswrapper[4906]: I0221 00:08:46.692802 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:46 crc kubenswrapper[4906]: I0221 00:08:46.692831 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:46 crc kubenswrapper[4906]: I0221 00:08:46.692851 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:46Z","lastTransitionTime":"2026-02-21T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:46 crc kubenswrapper[4906]: I0221 00:08:46.796729 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:46 crc kubenswrapper[4906]: I0221 00:08:46.796806 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:46 crc kubenswrapper[4906]: I0221 00:08:46.796822 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:46 crc kubenswrapper[4906]: I0221 00:08:46.796846 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:46 crc kubenswrapper[4906]: I0221 00:08:46.796861 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:46Z","lastTransitionTime":"2026-02-21T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:46 crc kubenswrapper[4906]: I0221 00:08:46.899045 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:46 crc kubenswrapper[4906]: I0221 00:08:46.899284 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:46 crc kubenswrapper[4906]: I0221 00:08:46.899397 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:46 crc kubenswrapper[4906]: I0221 00:08:46.899495 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:46 crc kubenswrapper[4906]: I0221 00:08:46.899561 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:46Z","lastTransitionTime":"2026-02-21T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.003262 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.003316 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.003333 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.003356 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.003373 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:47Z","lastTransitionTime":"2026-02-21T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.106465 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.106527 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.106552 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.106586 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.106608 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:47Z","lastTransitionTime":"2026-02-21T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.210053 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.210667 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.210901 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.211044 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.211165 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:47Z","lastTransitionTime":"2026-02-21T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.315348 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.315674 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.315917 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.316069 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.316203 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:47Z","lastTransitionTime":"2026-02-21T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.419930 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.420016 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.420043 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.420073 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.420096 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:47Z","lastTransitionTime":"2026-02-21T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.514142 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 22:20:27.435766466 +0000 UTC Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.516663 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.516824 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:08:47 crc kubenswrapper[4906]: E0221 00:08:47.516922 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.516664 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:08:47 crc kubenswrapper[4906]: E0221 00:08:47.517070 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:08:47 crc kubenswrapper[4906]: E0221 00:08:47.517168 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhw7p" podUID="7544a92e-993a-46af-9f26-243f53d1206d" Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.517233 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:08:47 crc kubenswrapper[4906]: E0221 00:08:47.517321 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.523423 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.523483 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.523505 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.523534 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.523556 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:47Z","lastTransitionTime":"2026-02-21T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.627289 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.627345 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.627361 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.627380 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.627393 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:47Z","lastTransitionTime":"2026-02-21T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.730008 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.730073 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.730089 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.730115 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.730132 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:47Z","lastTransitionTime":"2026-02-21T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.832723 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.832786 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.832803 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.832825 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.832842 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:47Z","lastTransitionTime":"2026-02-21T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.936412 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.936455 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.936467 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.936484 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:47 crc kubenswrapper[4906]: I0221 00:08:47.936496 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:47Z","lastTransitionTime":"2026-02-21T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.039323 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.039743 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.039758 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.039777 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.039791 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:48Z","lastTransitionTime":"2026-02-21T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.143072 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.143158 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.143181 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.143214 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.143237 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:48Z","lastTransitionTime":"2026-02-21T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.246980 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.247046 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.247064 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.247089 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.247106 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:48Z","lastTransitionTime":"2026-02-21T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.350878 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.351046 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.351077 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.351152 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.351181 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:48Z","lastTransitionTime":"2026-02-21T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.454581 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.454625 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.454645 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.454673 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.454737 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:48Z","lastTransitionTime":"2026-02-21T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.514783 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 02:29:13.96317587 +0000 UTC Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.558903 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.558946 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.558961 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.558982 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.558998 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:48Z","lastTransitionTime":"2026-02-21T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.662196 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.662239 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.662257 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.662280 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.662296 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:48Z","lastTransitionTime":"2026-02-21T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.765839 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.765906 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.765929 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.765959 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.765980 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:48Z","lastTransitionTime":"2026-02-21T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.869516 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.869603 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.869623 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.869649 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.869666 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:48Z","lastTransitionTime":"2026-02-21T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.972757 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.972841 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.972865 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.972893 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:48 crc kubenswrapper[4906]: I0221 00:08:48.972915 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:48Z","lastTransitionTime":"2026-02-21T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:49 crc kubenswrapper[4906]: I0221 00:08:49.076221 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:49 crc kubenswrapper[4906]: I0221 00:08:49.076284 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:49 crc kubenswrapper[4906]: I0221 00:08:49.076317 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:49 crc kubenswrapper[4906]: I0221 00:08:49.076350 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:49 crc kubenswrapper[4906]: I0221 00:08:49.076371 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:49Z","lastTransitionTime":"2026-02-21T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:49 crc kubenswrapper[4906]: I0221 00:08:49.180032 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:49 crc kubenswrapper[4906]: I0221 00:08:49.180104 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:49 crc kubenswrapper[4906]: I0221 00:08:49.180137 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:49 crc kubenswrapper[4906]: I0221 00:08:49.180166 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:49 crc kubenswrapper[4906]: I0221 00:08:49.180183 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:49Z","lastTransitionTime":"2026-02-21T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:49 crc kubenswrapper[4906]: I0221 00:08:49.283398 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:49 crc kubenswrapper[4906]: I0221 00:08:49.283468 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:49 crc kubenswrapper[4906]: I0221 00:08:49.283490 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:49 crc kubenswrapper[4906]: I0221 00:08:49.283519 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:49 crc kubenswrapper[4906]: I0221 00:08:49.283540 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:49Z","lastTransitionTime":"2026-02-21T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:49 crc kubenswrapper[4906]: I0221 00:08:49.386402 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:49 crc kubenswrapper[4906]: I0221 00:08:49.386462 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:49 crc kubenswrapper[4906]: I0221 00:08:49.386486 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:49 crc kubenswrapper[4906]: I0221 00:08:49.386515 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:49 crc kubenswrapper[4906]: I0221 00:08:49.386538 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:49Z","lastTransitionTime":"2026-02-21T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:49 crc kubenswrapper[4906]: I0221 00:08:49.490084 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:49 crc kubenswrapper[4906]: I0221 00:08:49.490141 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:49 crc kubenswrapper[4906]: I0221 00:08:49.490175 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:49 crc kubenswrapper[4906]: I0221 00:08:49.490207 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:49 crc kubenswrapper[4906]: I0221 00:08:49.490229 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:49Z","lastTransitionTime":"2026-02-21T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:49 crc kubenswrapper[4906]: I0221 00:08:49.515574 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 15:29:26.946937153 +0000 UTC Feb 21 00:08:49 crc kubenswrapper[4906]: I0221 00:08:49.516952 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:08:49 crc kubenswrapper[4906]: I0221 00:08:49.516955 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:08:49 crc kubenswrapper[4906]: I0221 00:08:49.517024 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:08:49 crc kubenswrapper[4906]: E0221 00:08:49.517133 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:08:49 crc kubenswrapper[4906]: E0221 00:08:49.517234 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhw7p" podUID="7544a92e-993a-46af-9f26-243f53d1206d" Feb 21 00:08:49 crc kubenswrapper[4906]: E0221 00:08:49.517300 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:08:49 crc kubenswrapper[4906]: I0221 00:08:49.517732 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:08:49 crc kubenswrapper[4906]: E0221 00:08:49.517827 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:08:49 crc kubenswrapper[4906]: I0221 00:08:49.592588 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:49 crc kubenswrapper[4906]: I0221 00:08:49.592639 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:49 crc kubenswrapper[4906]: I0221 00:08:49.592659 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:49 crc kubenswrapper[4906]: I0221 00:08:49.592724 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:49 crc kubenswrapper[4906]: I0221 00:08:49.592782 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:49Z","lastTransitionTime":"2026-02-21T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:49 crc kubenswrapper[4906]: I0221 00:08:49.695532 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:49 crc kubenswrapper[4906]: I0221 00:08:49.695566 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:49 crc kubenswrapper[4906]: I0221 00:08:49.695576 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:49 crc kubenswrapper[4906]: I0221 00:08:49.695593 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:49 crc kubenswrapper[4906]: I0221 00:08:49.695605 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:49Z","lastTransitionTime":"2026-02-21T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:49 crc kubenswrapper[4906]: I0221 00:08:49.797784 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:49 crc kubenswrapper[4906]: I0221 00:08:49.797835 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:49 crc kubenswrapper[4906]: I0221 00:08:49.797847 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:49 crc kubenswrapper[4906]: I0221 00:08:49.797865 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:49 crc kubenswrapper[4906]: I0221 00:08:49.797877 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:49Z","lastTransitionTime":"2026-02-21T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:49 crc kubenswrapper[4906]: I0221 00:08:49.901268 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:49 crc kubenswrapper[4906]: I0221 00:08:49.901386 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:49 crc kubenswrapper[4906]: I0221 00:08:49.901908 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:49 crc kubenswrapper[4906]: I0221 00:08:49.901997 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:49 crc kubenswrapper[4906]: I0221 00:08:49.902261 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:49Z","lastTransitionTime":"2026-02-21T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.005337 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.005408 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.005434 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.005465 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.005488 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:50Z","lastTransitionTime":"2026-02-21T00:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.108368 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.108421 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.108442 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.108472 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.108526 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:50Z","lastTransitionTime":"2026-02-21T00:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.210957 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.211004 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.211016 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.211035 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.211056 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:50Z","lastTransitionTime":"2026-02-21T00:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.314040 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.314107 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.314125 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.314152 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.314171 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:50Z","lastTransitionTime":"2026-02-21T00:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.417476 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.417545 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.417569 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.417597 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.417619 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:50Z","lastTransitionTime":"2026-02-21T00:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.516014 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 04:13:06.995328718 +0000 UTC Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.520460 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.520506 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.520523 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.520544 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.520561 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:50Z","lastTransitionTime":"2026-02-21T00:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.623607 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.623656 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.623668 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.623701 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.623714 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:50Z","lastTransitionTime":"2026-02-21T00:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.726673 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.726718 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.726727 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.726741 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.726750 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:50Z","lastTransitionTime":"2026-02-21T00:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.830076 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.830140 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.830153 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.830175 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.830190 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:50Z","lastTransitionTime":"2026-02-21T00:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.933875 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.933939 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.933950 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.933973 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:50 crc kubenswrapper[4906]: I0221 00:08:50.933991 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:50Z","lastTransitionTime":"2026-02-21T00:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.036718 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.036779 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.036800 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.036824 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.036842 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:51Z","lastTransitionTime":"2026-02-21T00:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.140070 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.140126 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.140141 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.140162 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.140175 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:51Z","lastTransitionTime":"2026-02-21T00:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.242671 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.242750 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.242768 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.242794 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.242811 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:51Z","lastTransitionTime":"2026-02-21T00:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.345196 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.345237 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.345249 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.345265 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.345277 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:51Z","lastTransitionTime":"2026-02-21T00:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.448197 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.448251 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.448260 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.448280 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.448291 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:51Z","lastTransitionTime":"2026-02-21T00:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.516155 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 05:27:45.215894929 +0000 UTC Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.516337 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.516467 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:08:51 crc kubenswrapper[4906]: E0221 00:08:51.516731 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.516780 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.516822 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:08:51 crc kubenswrapper[4906]: E0221 00:08:51.516956 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhw7p" podUID="7544a92e-993a-46af-9f26-243f53d1206d" Feb 21 00:08:51 crc kubenswrapper[4906]: E0221 00:08:51.517111 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:08:51 crc kubenswrapper[4906]: E0221 00:08:51.517247 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.552181 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.552229 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.552239 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.552257 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.552272 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:51Z","lastTransitionTime":"2026-02-21T00:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.655517 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.655575 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.655589 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.655611 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.655624 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:51Z","lastTransitionTime":"2026-02-21T00:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.758659 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.758731 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.758740 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.758754 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.758763 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:51Z","lastTransitionTime":"2026-02-21T00:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.861047 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.861096 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.861107 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.861125 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.861139 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:51Z","lastTransitionTime":"2026-02-21T00:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.963133 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.963192 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.963204 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.963223 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:51 crc kubenswrapper[4906]: I0221 00:08:51.963237 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:51Z","lastTransitionTime":"2026-02-21T00:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.066058 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.066107 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.066118 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.066136 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.066149 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:52Z","lastTransitionTime":"2026-02-21T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.168838 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.168890 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.168902 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.168919 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.168929 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:52Z","lastTransitionTime":"2026-02-21T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.272393 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.272456 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.272472 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.272499 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.272517 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:52Z","lastTransitionTime":"2026-02-21T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.375890 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.375941 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.375955 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.375975 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.375987 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:52Z","lastTransitionTime":"2026-02-21T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.480173 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.480244 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.480263 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.480287 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.480303 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:52Z","lastTransitionTime":"2026-02-21T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.516970 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 13:57:54.106831848 +0000 UTC Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.583886 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.583939 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.583951 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.583983 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.583999 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:52Z","lastTransitionTime":"2026-02-21T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.683219 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.683307 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.683321 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.683344 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.683363 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:52Z","lastTransitionTime":"2026-02-21T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:52 crc kubenswrapper[4906]: E0221 00:08:52.697743 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4d8b0bdc-2182-48d0-bb15-cc57765305f9\\\",\\\"systemUUID\\\":\\\"94310220-1d46-4502-bb0a-b3628ff11479\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:52Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.706923 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.706964 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.706980 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.706997 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.707011 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:52Z","lastTransitionTime":"2026-02-21T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:52 crc kubenswrapper[4906]: E0221 00:08:52.720537 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4d8b0bdc-2182-48d0-bb15-cc57765305f9\\\",\\\"systemUUID\\\":\\\"94310220-1d46-4502-bb0a-b3628ff11479\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:52Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.724188 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.724215 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.724225 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.724239 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.724249 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:52Z","lastTransitionTime":"2026-02-21T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:52 crc kubenswrapper[4906]: E0221 00:08:52.739240 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4d8b0bdc-2182-48d0-bb15-cc57765305f9\\\",\\\"systemUUID\\\":\\\"94310220-1d46-4502-bb0a-b3628ff11479\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:52Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.743044 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.743095 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.743107 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.743123 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.743136 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:52Z","lastTransitionTime":"2026-02-21T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:52 crc kubenswrapper[4906]: E0221 00:08:52.756090 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4d8b0bdc-2182-48d0-bb15-cc57765305f9\\\",\\\"systemUUID\\\":\\\"94310220-1d46-4502-bb0a-b3628ff11479\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:52Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.759389 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.759433 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.759444 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.759464 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.759476 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:52Z","lastTransitionTime":"2026-02-21T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:52 crc kubenswrapper[4906]: E0221 00:08:52.770948 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:08:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4d8b0bdc-2182-48d0-bb15-cc57765305f9\\\",\\\"systemUUID\\\":\\\"94310220-1d46-4502-bb0a-b3628ff11479\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:52Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:52 crc kubenswrapper[4906]: E0221 00:08:52.771118 4906 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.772910 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.772948 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.772958 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.772973 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.772982 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:52Z","lastTransitionTime":"2026-02-21T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.875496 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.875563 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.875577 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.875593 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.875605 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:52Z","lastTransitionTime":"2026-02-21T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.978433 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.978483 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.978495 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.978513 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:52 crc kubenswrapper[4906]: I0221 00:08:52.978526 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:52Z","lastTransitionTime":"2026-02-21T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.081402 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.081449 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.081461 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.081478 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.081492 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:53Z","lastTransitionTime":"2026-02-21T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.183922 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.183986 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.184002 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.184026 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.184043 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:53Z","lastTransitionTime":"2026-02-21T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.287423 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.287479 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.287491 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.287511 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.287523 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:53Z","lastTransitionTime":"2026-02-21T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.390032 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.390092 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.390112 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.390142 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.390163 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:53Z","lastTransitionTime":"2026-02-21T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.492622 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.492659 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.492672 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.492711 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.492725 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:53Z","lastTransitionTime":"2026-02-21T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.517024 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.517083 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.517138 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:08:53 crc kubenswrapper[4906]: E0221 00:08:53.517154 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.517199 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 17:06:38.628749743 +0000 UTC Feb 21 00:08:53 crc kubenswrapper[4906]: E0221 00:08:53.517249 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:08:53 crc kubenswrapper[4906]: E0221 00:08:53.517357 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.517382 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:08:53 crc kubenswrapper[4906]: E0221 00:08:53.517436 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhw7p" podUID="7544a92e-993a-46af-9f26-243f53d1206d" Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.595396 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.595436 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.595447 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.595464 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.595476 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:53Z","lastTransitionTime":"2026-02-21T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.628252 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7544a92e-993a-46af-9f26-243f53d1206d-metrics-certs\") pod \"network-metrics-daemon-rhw7p\" (UID: \"7544a92e-993a-46af-9f26-243f53d1206d\") " pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:08:53 crc kubenswrapper[4906]: E0221 00:08:53.628437 4906 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 21 00:08:53 crc kubenswrapper[4906]: E0221 00:08:53.628501 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7544a92e-993a-46af-9f26-243f53d1206d-metrics-certs podName:7544a92e-993a-46af-9f26-243f53d1206d nodeName:}" failed. No retries permitted until 2026-02-21 00:09:25.628483814 +0000 UTC m=+100.880071320 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7544a92e-993a-46af-9f26-243f53d1206d-metrics-certs") pod "network-metrics-daemon-rhw7p" (UID: "7544a92e-993a-46af-9f26-243f53d1206d") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.699002 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.699042 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.699050 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.699064 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.699073 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:53Z","lastTransitionTime":"2026-02-21T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.802664 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.802731 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.802767 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.802784 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.802796 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:53Z","lastTransitionTime":"2026-02-21T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.905143 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.905185 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.905196 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.905211 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:53 crc kubenswrapper[4906]: I0221 00:08:53.905220 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:53Z","lastTransitionTime":"2026-02-21T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.008560 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.008595 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.008604 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.008616 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.008625 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:54Z","lastTransitionTime":"2026-02-21T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.111398 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.111471 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.111493 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.111524 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.111547 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:54Z","lastTransitionTime":"2026-02-21T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.215174 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.215231 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.215249 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.215339 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.215361 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:54Z","lastTransitionTime":"2026-02-21T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.317718 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.317769 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.317786 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.317815 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.317832 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:54Z","lastTransitionTime":"2026-02-21T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.420447 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.420482 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.420492 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.420508 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.420518 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:54Z","lastTransitionTime":"2026-02-21T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.517737 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 02:14:06.933470875 +0000 UTC Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.522794 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.522827 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.522860 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.522873 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.522881 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:54Z","lastTransitionTime":"2026-02-21T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.626042 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.626085 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.626093 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.626110 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.626118 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:54Z","lastTransitionTime":"2026-02-21T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.729447 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.729509 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.729521 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.729542 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.729555 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:54Z","lastTransitionTime":"2026-02-21T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.832874 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.832953 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.832971 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.832997 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.833015 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:54Z","lastTransitionTime":"2026-02-21T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.935878 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.935942 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.935964 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.935992 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:54 crc kubenswrapper[4906]: I0221 00:08:54.936014 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:54Z","lastTransitionTime":"2026-02-21T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.038139 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cqkxl_d15db4e7-a13a-4bd9-8083-1ed09be64a82/kube-multus/0.log" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.038245 4906 generic.go:334] "Generic (PLEG): container finished" podID="d15db4e7-a13a-4bd9-8083-1ed09be64a82" containerID="1f4f38fff46919386afd729ea3f6437497e71eeeb557d7f2b955d0677b822a86" exitCode=1 Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.038297 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cqkxl" event={"ID":"d15db4e7-a13a-4bd9-8083-1ed09be64a82","Type":"ContainerDied","Data":"1f4f38fff46919386afd729ea3f6437497e71eeeb557d7f2b955d0677b822a86"} Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.038897 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.038960 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.038973 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.038998 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.039011 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:55Z","lastTransitionTime":"2026-02-21T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.039150 4906 scope.go:117] "RemoveContainer" containerID="1f4f38fff46919386afd729ea3f6437497e71eeeb557d7f2b955d0677b822a86" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.057099 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:55Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.077081 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfzxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8dc1d37c07d67efbc80d224af9d36cba5c59cd0d51c72f41f7f01b1e5fed80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1957f
1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1957f1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:12Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfzxw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:55Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.092774 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hbhzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24b0c67-fe8d-4e72-916a-d82306a8b82e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eec6020976a803c47d16f08a33b17b7b9654b2a9b8dd75a2c6f0a5cde4b2b963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5zp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hbhzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:55Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.117055 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23efa997-378b-44cd-9f05-4a80559cd09b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae4c734bcf6dc35c8437a49887777a17b228f7196bf2d77a8179923f121351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1574681de69105a2b0693609c82ef8c0cd554037fe2f4a2fd7345650bf4a006\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ead16930cc675e04c24dd469e0e7f7ada74a7ef3fe87e095e6702204361d89bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd0d9e0d6ac22632f0ca07c3ad926d99b71e9d1757e8ef0ee87a2008afbc671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e86b35bdd91ab61ef7469c956499be0dd990275293748f4fb8f43f2f42d84fc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8f4202ebd523d2b63c5625f20cb60477ce419a5b34d926dccab2661b2c107aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ce8337bd884702fc27683799feff2cce24f9ad92151118070c084a04c0193b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ce8337bd884702fc27683799feff2cce24f9ad92151118070c084a04c0193b3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T00:08:33Z\\\",\\\"message\\\":\\\"work policy controller, err: could not add Event Handler for anpInformer during admin 
network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:33Z is after 2025-08-24T17:21:41Z]\\\\nI0221 00:08:33.409954 6529 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}},\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bmsd9_openshift-ovn-kubernetes(23efa997-378b-44cd-9f05-4a80559cd09b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef79e3c346342fa9d098b503a7dff636ce88cb88465cf6e802b9581980e46d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442b
ced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmsd9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:55Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.135170 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rhw7p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7544a92e-993a-46af-9f26-243f53d1206d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46f56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46f56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rhw7p\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:55Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.142088 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.142160 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.142172 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.142190 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.142205 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:55Z","lastTransitionTime":"2026-02-21T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.148516 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f1d03697352fb3306d487c9b25ee0f95f98488fe929392eac603103299efd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:55Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.164464 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cqkxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15db4e7-a13a-4bd9-8083-1ed09be64a82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4f38fff46919386afd729ea3f6437497e71eeeb557d7f2b955d0677b822a86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f4f38fff46919386afd729ea3f6437497e71eeeb557d7f2b955d0677b822a86\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T00:08:54Z\\\",\\\"message\\\":\\\"2026-02-21T00:08:09+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d9117141-7de4-47a2-8a38-012c726038a3\\\\n2026-02-21T00:08:09+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d9117141-7de4-47a2-8a38-012c726038a3 to /host/opt/cni/bin/\\\\n2026-02-21T00:08:09Z [verbose] multus-daemon started\\\\n2026-02-21T00:08:09Z [verbose] Readiness Indicator file check\\\\n2026-02-21T00:08:54Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cndgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cqkxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:55Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.176420 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qqnnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e340b2-f60e-4535-b762-294c8685122e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69f35601a717fc132b2180c4d3f2ee38faf5fd0649a002278990d44c2d7be3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjg99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46b9920d5411c5f15ee73b4448dee3a9d44d8999d7b842c8d50afc1b5c3fc89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjg99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qqnnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:55Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.206722 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d011e7-c2df-4b35-baf1-8b0404a8ae51\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686312fec8d89c23b2ef872e891fd0c1ba1279b0a64c834893c2d8431fca05bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bc1cd9f332e30d7fde3e05085076734d742d2bf375b3b5a55ba7a7e42e0a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbe8b3f30d9bff0951ca61694164e4ece35ca5d3562ad3571150b1cd13236c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255599132ce34bbd49e4298935a18d1294b2055069c606d796d58656a5552a02\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c798be0178552e289398fd4312d627282ba16f1cd39b2eb7d6e2ba6d4277323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd7
2b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:55Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.220902 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3359de6c-dfba-4630-b39d-68e056b5d2ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7928eee765ac7aaa7118868638603d627a00b59850dac177c991754fc324122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d4b213ed7600da4237ff1c423d004e3afdaff5f599b453398266f6cdae16ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3218b20e5cf2be1651a9fbaef483c8cb2bd297449319f559727af1a47661c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f5568f3c4167bda4510df4f94df728f3286e7c8137bb0ac9f6af2c30c4992a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:55Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.233823 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:55Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.245492 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.245535 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.245548 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 
00:08:55.245565 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.245576 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:55Z","lastTransitionTime":"2026-02-21T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.247585 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:55Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.262436 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8cf02d3-2a07-464d-b75f-8d3ad8374553\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d527700428fdbde47902ae05e7592fb0f3b43af16fa72386eb4a1ff274156a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104130e8d59df60560691070c803ca780c06ea5439f8d04c160d53df2042c41f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T00:08:05Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 00:07:59.169352 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 00:07:59.172022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-365547171/tls.crt::/tmp/serving-cert-365547171/tls.key\\\\\\\"\\\\nI0221 00:08:05.242429 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 00:08:05.251263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 00:08:05.251291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 00:08:05.251317 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 00:08:05.251325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 00:08:05.269495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0221 00:08:05.269788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269943 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 00:08:05.269971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 00:08:05.269996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 00:08:05.270022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0221 00:08:05.269815 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0221 00:08:05.276526 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f2
2cee82852bec2d1f9d010c485e5d1a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:55Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.274483 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ceaf11f6f4645b302c0c6371902385cc6f8d6bdaa01cedf578765b7d72ca44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:55Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.288218 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17518505-fa81-4399-b6cd-5527dae35ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b5adc60c3a9b0129022f0c54757cf20624c76e6757c4595aa5ff3f80d69479f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c753f098aae83a1b91b668b00166c9de9e5fc03f7a39708263241e934d83fb81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9qdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:55Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.299758 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x4dw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5dbf78b-7aff-48c9-9064-b47deb9527b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ce9ee6a3dd83591b7395bbbdd306eea6587bd93b0ce540ccf67b61f66077c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6w55s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x4dw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:55Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.312439 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bf31c61-03f5-48a6-9bc2-3071d6097c5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d2240fcc84fe2ddd0f38e9418c7ad17b0a19fd0b283a2833790a7ccfeda9d7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ef80849abf92993417e85ac3dec5895f800ebf7f3d27bfc052b9bcbefde9115\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5930ffb00a4f30eb0905205eee3f562b0b57a27d60f960d1e11a60f16da8aa21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0451d4a6a1426a64e5717a5c6ebb6357d13778bc225c6d9f1aff545928bd572f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0451d4a6a1426a64e5717a5c6ebb6357d13778bc225c6d9f1aff545928bd572f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:55Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.324765 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e655a3c2f3a6223fe12dded10c0c7b2e8b5024914ead29887a06efd4141b670e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c0b66a5fe8b81df7b0edfcef98acab5208abfce4cedc368707c6ce804b9b99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:55Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.347980 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.348026 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.348038 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.348059 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.348072 4906 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:55Z","lastTransitionTime":"2026-02-21T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.450783 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.450814 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.450824 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.450839 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.450848 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:55Z","lastTransitionTime":"2026-02-21T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.519397 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:08:55 crc kubenswrapper[4906]: E0221 00:08:55.519518 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhw7p" podUID="7544a92e-993a-46af-9f26-243f53d1206d" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.519800 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:08:55 crc kubenswrapper[4906]: E0221 00:08:55.519852 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.519893 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:08:55 crc kubenswrapper[4906]: E0221 00:08:55.519946 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.519988 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:08:55 crc kubenswrapper[4906]: E0221 00:08:55.520034 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.520186 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 23:46:39.533709501 +0000 UTC Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.533540 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f1d03697352fb3306d487c9b25ee0f95f98488fe929392eac603103299efd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T00:08:55Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.552607 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.552645 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.552654 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.552672 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.552696 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:55Z","lastTransitionTime":"2026-02-21T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.553873 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cqkxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15db4e7-a13a-4bd9-8083-1ed09be64a82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4f38fff46919386afd729ea3f6437497e71eeeb557d7f2b955d0677b822a86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f4f38fff46919386afd729ea3f6437497e71eeeb557d7f2b955d0677b822a86\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T00:08:54Z\\\",\\\"message\\\":\\\"2026-02-21T00:08:09+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d9117141-7de4-47a2-8a38-012c726038a3\\\\n2026-02-21T00:08:09+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d9117141-7de4-47a2-8a38-012c726038a3 to /host/opt/cni/bin/\\\\n2026-02-21T00:08:09Z [verbose] multus-daemon started\\\\n2026-02-21T00:08:09Z [verbose] Readiness Indicator file check\\\\n2026-02-21T00:08:54Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cndgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cqkxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:55Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.566709 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qqnnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e340b2-f60e-4535-b762-294c8685122e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69f35601a717fc132b2180c4d3f2ee38faf5fd0649a002278990d44c2d7be3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjg99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46b9920d5411c5f15ee73b4448dee3a9d44d8999d7b842c8d50afc1b5c3fc89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjg99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qqnnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:55Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.588557 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d011e7-c2df-4b35-baf1-8b0404a8ae51\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686312fec8d89c23b2ef872e891fd0c1ba1279b0a64c834893c2d8431fca05bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bc1cd9f332e30d7fde3e05085076734d742d2bf375b3b5a55ba7a7e42e0a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbe8b3f30d9bff0951ca61694164e4ece35ca5d3562ad3571150b1cd13236c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255599132ce34bbd49e4298935a18d1294b2055069c606d796d58656a5552a02\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c798be0178552e289398fd4312d627282ba16f1cd39b2eb7d6e2ba6d4277323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd7
2b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:55Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.603507 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3359de6c-dfba-4630-b39d-68e056b5d2ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7928eee765ac7aaa7118868638603d627a00b59850dac177c991754fc324122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d4b213ed7600da4237ff1c423d004e3afdaff5f599b453398266f6cdae16ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3218b20e5cf2be1651a9fbaef483c8cb2bd297449319f559727af1a47661c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f5568f3c4167bda4510df4f94df728f3286e7c8137bb0ac9f6af2c30c4992a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:55Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.620008 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:55Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.634936 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:55Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.650867 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8cf02d3-2a07-464d-b75f-8d3ad8374553\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d527700428fdbde47902ae05e7592fb0f3b43af16fa72386eb4a1ff274156a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104130e8d59df60560691070c803ca780c06ea5439f8d04c160d53df2042c41f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T00:08:05Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 00:07:59.169352 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 00:07:59.172022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-365547171/tls.crt::/tmp/serving-cert-365547171/tls.key\\\\\\\"\\\\nI0221 00:08:05.242429 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 00:08:05.251263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 00:08:05.251291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 00:08:05.251317 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 00:08:05.251325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 00:08:05.269495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0221 00:08:05.269788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269943 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 00:08:05.269971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 00:08:05.269996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 00:08:05.270022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0221 00:08:05.269815 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0221 00:08:05.276526 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f2
2cee82852bec2d1f9d010c485e5d1a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:55Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.659857 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.659896 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.659907 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.659922 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.659932 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:55Z","lastTransitionTime":"2026-02-21T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.676958 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ceaf11f6f4645b302c0c6371902385cc6f8d6bdaa01cedf578765b7d72ca44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:55Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.692030 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17518505-fa81-4399-b6cd-5527dae35ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b5adc60c3a9b0129022f0c54757cf20624c76e6757c4595aa5ff3f80d69479f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube
-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c753f098aae83a1b91b668b00166c9de9e5fc03f7a39708263241e934d83fb81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9qdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-21T00:08:55Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.705663 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x4dw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5dbf78b-7aff-48c9-9064-b47deb9527b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ce9ee6a3dd83591b7395bbbdd306eea6587bd93b0ce540ccf67b61f66077c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6w55s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x4dw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:55Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.719515 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bf31c61-03f5-48a6-9bc2-3071d6097c5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d2240fcc84fe2ddd0f38e9418c7ad17b0a19fd0b283a2833790a7ccfeda9d7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ef80849abf92993417e85ac3dec5895f800ebf7f3d27bfc052b9bcbefde9115\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5930ffb00a4f30eb0905205eee3f562b0b57a27d60f960d1e11a60f16da8aa21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0451d4a6a1426a64e5717a5c6ebb6357d13778bc225c6d9f1aff545928bd572f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://0451d4a6a1426a64e5717a5c6ebb6357d13778bc225c6d9f1aff545928bd572f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:55Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.734038 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e655a3c2f3a6223fe12dded10c0c7b2e8b5024914ead29887a06efd4141b670e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c0b66a5fe8b81df7b0edfcef98acab5208abfce4cedc368707c6ce804b9b99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:55Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.747718 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:55Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.761699 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.761730 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.761739 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.761755 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.761766 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:55Z","lastTransitionTime":"2026-02-21T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.770197 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfzxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8dc1d37c07d67efbc80d224af9d36cba5c59cd0d51c72f41f7f01b1e5fed80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1957f
1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1957f1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:12Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfzxw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:55Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.782135 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hbhzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24b0c67-fe8d-4e72-916a-d82306a8b82e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eec6020976a803c47d16f08a33b17b7b9654b2a9b8dd75a2c6f0a5cde4b2b963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5zp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hbhzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:55Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.801221 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23efa997-378b-44cd-9f05-4a80559cd09b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae4c734bcf6dc35c8437a49887777a17b228f7196bf2d77a8179923f121351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1574681de69105a2b0693609c82ef8c0cd554037fe2f4a2fd7345650bf4a006\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ead16930cc675e04c24dd469e0e7f7ada74a7ef3fe87e095e6702204361d89bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd0d9e0d6ac22632f0ca07c3ad926d99b71e9d1757e8ef0ee87a2008afbc671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e86b35bdd91ab61ef7469c956499be0dd990275293748f4fb8f43f2f42d84fc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8f4202ebd523d2b63c5625f20cb60477ce419a5b34d926dccab2661b2c107aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ce8337bd884702fc27683799feff2cce24f9ad92151118070c084a04c0193b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ce8337bd884702fc27683799feff2cce24f9ad92151118070c084a04c0193b3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T00:08:33Z\\\",\\\"message\\\":\\\"work policy controller, err: could not add Event Handler for anpInformer during admin 
network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:33Z is after 2025-08-24T17:21:41Z]\\\\nI0221 00:08:33.409954 6529 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}},\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bmsd9_openshift-ovn-kubernetes(23efa997-378b-44cd-9f05-4a80559cd09b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef79e3c346342fa9d098b503a7dff636ce88cb88465cf6e802b9581980e46d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442b
ced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmsd9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:55Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.814531 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rhw7p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7544a92e-993a-46af-9f26-243f53d1206d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46f56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46f56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rhw7p\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:55Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.863817 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.863854 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.863863 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.863878 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.863890 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:55Z","lastTransitionTime":"2026-02-21T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.966732 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.966798 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.966812 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.966838 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:55 crc kubenswrapper[4906]: I0221 00:08:55.966853 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:55Z","lastTransitionTime":"2026-02-21T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.044096 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cqkxl_d15db4e7-a13a-4bd9-8083-1ed09be64a82/kube-multus/0.log" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.044171 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cqkxl" event={"ID":"d15db4e7-a13a-4bd9-8083-1ed09be64a82","Type":"ContainerStarted","Data":"3b20b532af977461f36343fe3b9c58f154e726adea87f4f9c31c95a7c46dc495"} Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.058446 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bf31c61-03f5-48a6-9bc2-3071d6097c5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d2240fcc84fe2ddd0f38e9418c7ad17b0a19fd0b283a2833790a7ccfeda9d7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243
b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ef80849abf92993417e85ac3dec5895f800ebf7f3d27bfc052b9bcbefde9115\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5930ffb00a4f30eb0905205eee3f562b0b57a27d60f960d1e11a60f16da8aa21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\
\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0451d4a6a1426a64e5717a5c6ebb6357d13778bc225c6d9f1aff545928bd572f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0451d4a6a1426a64e5717a5c6ebb6357d13778bc225c6d9f1aff545928bd572f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:56Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.069499 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.069556 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.069570 4906 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.069592 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.069607 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:56Z","lastTransitionTime":"2026-02-21T00:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.073753 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e655a3c2f3a6223fe12dded10c0c7b2e8b5024914ead29887a06efd4141b670e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c0b66a5fe8b81df7b0edfcef98acab5208abfce4cedc368707c6ce804b9b99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-02-21T00:08:56Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.097284 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23efa997-378b-44cd-9f05-4a80559cd09b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae4c734bcf6dc35c8437a49887777a17b228f7196bf2d77a8179923f121351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1574681de69105a2b0693609c82ef8c0cd554037fe2f4a2fd7345650bf4a006\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ead16930cc675e04c24dd469e0e7f7ada74a7ef3fe87e095e6702204361d89bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd0d9e0d6ac22632f0ca07c3ad926d99b71e9d1757e8ef0ee87a2008afbc671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e86b35bdd91ab61ef7469c956499be0dd990275293748f4fb8f43f2f42d84fc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8f4202ebd523d2b63c5625f20cb60477ce419a5b34d926dccab2661b2c107aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ce8337bd884702fc27683799feff2cce24f9ad92151118070c084a04c0193b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ce8337bd884702fc27683799feff2cce24f9ad92151118070c084a04c0193b3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T00:08:33Z\\\",\\\"message\\\":\\\"work policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc 
annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:33Z is after 2025-08-24T17:21:41Z]\\\\nI0221 00:08:33.409954 6529 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}},\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bmsd9_openshift-ovn-kubernetes(23efa997-378b-44cd-9f05-4a80559cd09b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef79e3c346342fa9d098b503a7dff636ce88cb88465cf6e802b9581980e46d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442b
ced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmsd9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:56Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.115216 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rhw7p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7544a92e-993a-46af-9f26-243f53d1206d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46f56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46f56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rhw7p\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:56Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.131456 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:56Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.148310 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfzxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8dc1d37c07d67efbc80d224af9d36cba5c59cd0d51c72f41f7f01b1e5fed80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1957f
1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1957f1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:12Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfzxw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:56Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.160225 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hbhzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24b0c67-fe8d-4e72-916a-d82306a8b82e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eec6020976a803c47d16f08a33b17b7b9654b2a9b8dd75a2c6f0a5cde4b2b963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5zp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hbhzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:56Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.173076 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.173135 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.173150 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.173170 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.173185 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:56Z","lastTransitionTime":"2026-02-21T00:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.173201 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:56Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.185082 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:56Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.195426 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f1d03697352fb3306d487c9b25ee0f95f98488fe929392eac603103299efd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T00:08:56Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.207742 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cqkxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15db4e7-a13a-4bd9-8083-1ed09be64a82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b20b532af977461f36343fe3b9c58f154e726adea87f4f9c31c95a7c46dc495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f4f38fff46919386afd729ea3f6437497e71eeeb557d7f2b955d0677b822a86\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T00:08:54Z\\\",\\\"message\\\":\\\"2026-02-21T00:08:09+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_d9117141-7de4-47a2-8a38-012c726038a3\\\\n2026-02-21T00:08:09+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d9117141-7de4-47a2-8a38-012c726038a3 to /host/opt/cni/bin/\\\\n2026-02-21T00:08:09Z [verbose] multus-daemon started\\\\n2026-02-21T00:08:09Z [verbose] Readiness Indicator file check\\\\n2026-02-21T00:08:54Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cndgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cqkxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:56Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.217436 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qqnnm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e340b2-f60e-4535-b762-294c8685122e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69f35601a717fc132b2180c4d3f2ee38faf5fd0649a002278990d44c2d7be3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjg99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46b9920d5411c5f15ee73b4448dee3a9d44d
8999d7b842c8d50afc1b5c3fc89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjg99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qqnnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:56Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.234672 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d011e7-c2df-4b35-baf1-8b0404a8ae51\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686312fec8d89c23b2ef872e891fd0c1ba1279b0a64c834893c2d8431fca05bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bc1cd9f332e30d7fde3e05085076734d742d2bf375b3b5a55ba7a7e42e0a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbe8b3f30d9bff0951ca61694164e4ece35ca5d3562ad3571150b1cd13236c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255599132ce34bbd49e4298935a18d1294b2055069c606d796d58656a5552a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c798be0178552e289398fd4312d627282ba16f1cd39b2eb7d6e2ba6d4277323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:56Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.248710 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3359de6c-dfba-4630-b39d-68e056b5d2ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7928eee765ac7aaa7118868638603d627a00b59850dac177c991754fc324122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d4b213ed7600da4237ff1c423d004e3afdaff5f599b453398266f6cdae16ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3218b20e5cf2be1651a9fbaef483c8cb2bd297449319f559727af1a47661c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:
47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f5568f3c4167bda4510df4f94df728f3286e7c8137bb0ac9f6af2c30c4992a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:56Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.272323 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x4dw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5dbf78b-7aff-48c9-9064-b47deb9527b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ce9ee6a3dd83591b7395bbbdd306eea6587bd93b0ce540ccf67b61f66077c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6w55s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x4dw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:56Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.277405 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.277449 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.277462 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.277484 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.277494 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:56Z","lastTransitionTime":"2026-02-21T00:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.291325 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8cf02d3-2a07-464d-b75f-8d3ad8374553\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d527700428fdbde47902ae05e7592fb0f3b43af16fa72386eb4a1ff274156a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104130e8d59df60560691070c803ca780c06ea5439f8d04c160d53df2042c41f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 00:07:59.169352 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 00:07:59.172022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-365547171/tls.crt::/tmp/serving-cert-365547171/tls.key\\\\\\\"\\\\nI0221 00:08:05.242429 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 00:08:05.251263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 00:08:05.251291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 00:08:05.251317 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 00:08:05.251325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 00:08:05.269495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0221 00:08:05.269788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269943 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 00:08:05.269971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 00:08:05.269996 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 00:08:05.270022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0221 00:08:05.269815 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0221 00:08:05.276526 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:56Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.305418 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ceaf11f6f4645b302c0c6371902385cc6f8d6bdaa01cedf578765b7d72ca44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:56Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.318154 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17518505-fa81-4399-b6cd-5527dae35ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b5adc60c3a9b0129022f0c54757cf20624c76e6757c4595aa5ff3f80d69479f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c753f098aae83a1b91b668b00166c9de9e5fc03f7a39708263241e934d83fb81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9qdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:56Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.380662 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:56 crc 
kubenswrapper[4906]: I0221 00:08:56.380735 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.380746 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.380771 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.380797 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:56Z","lastTransitionTime":"2026-02-21T00:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.483778 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.483817 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.483826 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.483843 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.483854 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:56Z","lastTransitionTime":"2026-02-21T00:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.521016 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 13:34:05.078115643 +0000 UTC Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.587223 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.587282 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.587295 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.587317 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.587329 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:56Z","lastTransitionTime":"2026-02-21T00:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.689892 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.689952 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.689970 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.689997 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.690016 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:56Z","lastTransitionTime":"2026-02-21T00:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.793312 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.793418 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.793436 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.793464 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.793480 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:56Z","lastTransitionTime":"2026-02-21T00:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.896126 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.896188 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.896202 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.896224 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.896237 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:56Z","lastTransitionTime":"2026-02-21T00:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:56 crc kubenswrapper[4906]: I0221 00:08:56.999783 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:56.999852 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:56.999870 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:56.999905 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:56.999923 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:56Z","lastTransitionTime":"2026-02-21T00:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.102464 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.102521 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.102535 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.102555 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.102571 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:57Z","lastTransitionTime":"2026-02-21T00:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.205449 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.205499 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.205509 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.205529 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.205543 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:57Z","lastTransitionTime":"2026-02-21T00:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.308816 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.308891 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.308908 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.308935 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.308958 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:57Z","lastTransitionTime":"2026-02-21T00:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.411865 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.411904 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.411915 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.411933 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.411944 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:57Z","lastTransitionTime":"2026-02-21T00:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.515228 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.515270 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.515282 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.515301 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.515316 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:57Z","lastTransitionTime":"2026-02-21T00:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.516094 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.516124 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.516411 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.516436 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:08:57 crc kubenswrapper[4906]: E0221 00:08:57.516540 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:08:57 crc kubenswrapper[4906]: E0221 00:08:57.516649 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:08:57 crc kubenswrapper[4906]: E0221 00:08:57.516912 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhw7p" podUID="7544a92e-993a-46af-9f26-243f53d1206d" Feb 21 00:08:57 crc kubenswrapper[4906]: E0221 00:08:57.517017 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.517157 4906 scope.go:117] "RemoveContainer" containerID="2ce8337bd884702fc27683799feff2cce24f9ad92151118070c084a04c0193b3" Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.521138 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 22:43:34.455612758 +0000 UTC Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.619257 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.619309 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.619319 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.619338 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.619352 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:57Z","lastTransitionTime":"2026-02-21T00:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.722160 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.722207 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.722219 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.722238 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.722257 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:57Z","lastTransitionTime":"2026-02-21T00:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.824718 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.824752 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.824761 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.824774 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.824785 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:57Z","lastTransitionTime":"2026-02-21T00:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.926763 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.926805 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.926815 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.926833 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:57 crc kubenswrapper[4906]: I0221 00:08:57.926846 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:57Z","lastTransitionTime":"2026-02-21T00:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.029417 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.029471 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.029484 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.029509 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.029522 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:58Z","lastTransitionTime":"2026-02-21T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.053792 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmsd9_23efa997-378b-44cd-9f05-4a80559cd09b/ovnkube-controller/2.log" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.056655 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" event={"ID":"23efa997-378b-44cd-9f05-4a80559cd09b","Type":"ContainerStarted","Data":"8d03248e0e19e586e9f1d4075918c0c678e58d924fb7daf7c7c30daca5f732a8"} Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.057075 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.071198 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8cf02d3-2a07-464d-b75f-8d3ad8374553\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d527700428fdbde47902ae05e7592fb0f3b43af16fa72386eb4a1ff274156a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104130e8d59df60560691070c803ca780c06ea5439f8d04c160d53df2042c41f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 00:07:59.169352 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 00:07:59.172022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-365547171/tls.crt::/tmp/serving-cert-365547171/tls.key\\\\\\\"\\\\nI0221 00:08:05.242429 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 00:08:05.251263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 00:08:05.251291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 00:08:05.251317 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 
00:08:05.251325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 00:08:05.269495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0221 00:08:05.269788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269943 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 00:08:05.269971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 00:08:05.269996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 00:08:05.270022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0221 00:08:05.269815 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0221 00:08:05.276526 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:58Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.089082 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ceaf11f6f4645b302c0c6371902385cc6f8d6bdaa01cedf578765b7d72ca44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:58Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.101133 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17518505-fa81-4399-b6cd-5527dae35ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b5adc60c3a9b0129022f0c54757cf20624c76e6757c4595aa5ff3f80d69479f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c753f098aae83a1b91b668b00166c9de9e5fc03f
7a39708263241e934d83fb81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9qdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:58Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.112751 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x4dw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5dbf78b-7aff-48c9-9064-b47deb9527b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ce9ee6a3dd83591b7395bbbdd306eea6587bd93b0ce540ccf67b61f66077c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6w55s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x4dw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:58Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.132341 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.132388 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.132399 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.132420 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.132431 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:58Z","lastTransitionTime":"2026-02-21T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.134719 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bf31c61-03f5-48a6-9bc2-3071d6097c5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d2240fcc84fe2ddd0f38e9418c7ad17b0a19fd0b283a2833790a7ccfeda9d7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ef80849abf92993417e85ac3dec58
95f800ebf7f3d27bfc052b9bcbefde9115\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5930ffb00a4f30eb0905205eee3f562b0b57a27d60f960d1e11a60f16da8aa21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0451d4a6a1426a64e5717a5c6ebb6357d13778bc225c6d9f1aff545928bd572f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0451d4a6a1426a64e5717a5c6ebb6357d13778bc225c6d9f1aff545928bd572f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:58Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.152995 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e655a3c2f3a6223fe12dded10c0c7b2e8b5024914ead29887a06efd4141b670e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c0b66a5fe8b81df7b0edfcef98acab5208abfce4cedc368707c6ce804b9b99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:58Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.170312 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:58Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.188399 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfzxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8dc1d37c07d67efbc80d224af9d36cba5c59cd0d51c72f41f7f01b1e5fed80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1957f
1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1957f1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:12Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfzxw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:58Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.197745 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hbhzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24b0c67-fe8d-4e72-916a-d82306a8b82e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eec6020976a803c47d16f08a33b17b7b9654b2a9b8dd75a2c6f0a5cde4b2b963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5zp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hbhzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:58Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.218769 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23efa997-378b-44cd-9f05-4a80559cd09b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae4c734bcf6dc35c8437a49887777a17b228f7196bf2d77a8179923f121351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1574681de69105a2b0693609c82ef8c0cd554037fe2f4a2fd7345650bf4a006\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ead16930cc675e04c24dd469e0e7f7ada74a7ef3fe87e095e6702204361d89bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd0d9e0d6ac22632f0ca07c3ad926d99b71e9d1757e8ef0ee87a2008afbc671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e86b35bdd91ab61ef7469c956499be0dd990275293748f4fb8f43f2f42d84fc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8f4202ebd523d2b63c5625f20cb60477ce419a5b34d926dccab2661b2c107aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d03248e0e19e586e9f1d4075918c0c678e58d924fb7daf7c7c30daca5f732a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ce8337bd884702fc27683799feff2cce24f9ad92151118070c084a04c0193b3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T00:08:33Z\\\",\\\"message\\\":\\\"work policy controller, err: could not add Event Handler for anpInformer during admin 
network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:33Z is after 2025-08-24T17:21:41Z]\\\\nI0221 00:08:33.409954 6529 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, 
Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}},\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountP
ath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef79e3c346342fa9d098b503a7dff636ce88cb88465cf6e802b9581980e46d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmsd9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:58Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.232731 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rhw7p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7544a92e-993a-46af-9f26-243f53d1206d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46f56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46f56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rhw7p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:58Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:58 crc 
kubenswrapper[4906]: I0221 00:08:58.234406 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.234435 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.234518 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.234537 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.234548 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:58Z","lastTransitionTime":"2026-02-21T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.245338 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cqkxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15db4e7-a13a-4bd9-8083-1ed09be64a82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b20b532af977461f36343fe3b9c58f154e726adea87f4f9c31c95a7c46dc495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f4f38fff46919386afd729ea3f6437497e71eeeb557d7f2b955d0677b822a86\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T00:08:54Z\\\",\\\"message\\\":\\\"2026-02-21T00:08:09+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d9117141-7de4-47a2-8a38-012c726038a3\\\\n2026-02-21T00:08:09+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d9117141-7de4-47a2-8a38-012c726038a3 to /host/opt/cni/bin/\\\\n2026-02-21T00:08:09Z [verbose] multus-daemon started\\\\n2026-02-21T00:08:09Z [verbose] Readiness Indicator file check\\\\n2026-02-21T00:08:54Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cndgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cqkxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:58Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.258062 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qqnnm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e340b2-f60e-4535-b762-294c8685122e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69f35601a717fc132b2180c4d3f2ee38faf5fd0649a002278990d44c2d7be3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjg99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46b9920d5411c5f15ee73b4448dee3a9d44d
8999d7b842c8d50afc1b5c3fc89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjg99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qqnnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:58Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.281392 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d011e7-c2df-4b35-baf1-8b0404a8ae51\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686312fec8d89c23b2ef872e891fd0c1ba1279b0a64c834893c2d8431fca05bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bc1cd9f332e30d7fde3e05085076734d742d2bf375b3b5a55ba7a7e42e0a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbe8b3f30d9bff0951ca61694164e4ece35ca5d3562ad3571150b1cd13236c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255599132ce34bbd49e4298935a18d1294b2055069c606d796d58656a5552a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c798be0178552e289398fd4312d627282ba16f1cd39b2eb7d6e2ba6d4277323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:58Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.293891 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3359de6c-dfba-4630-b39d-68e056b5d2ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7928eee765ac7aaa7118868638603d627a00b59850dac177c991754fc324122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d4b213ed7600da4237ff1c423d004e3afdaff5f599b453398266f6cdae16ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3218b20e5cf2be1651a9fbaef483c8cb2bd297449319f559727af1a47661c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:
47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f5568f3c4167bda4510df4f94df728f3286e7c8137bb0ac9f6af2c30c4992a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:58Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.310964 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:58Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.323507 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:58Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.336475 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.336522 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.336538 4906 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.336559 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.336570 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:58Z","lastTransitionTime":"2026-02-21T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.338087 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f1d03697352fb3306d487c9b25ee0f95f98488fe929392eac603103299efd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:58Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.439541 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.439578 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.439588 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.439607 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.439616 4906 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:58Z","lastTransitionTime":"2026-02-21T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.522110 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 23:21:29.394424604 +0000 UTC Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.542588 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.542667 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.542712 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.542741 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.542760 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:58Z","lastTransitionTime":"2026-02-21T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.646616 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.646709 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.646727 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.646755 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.646774 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:58Z","lastTransitionTime":"2026-02-21T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.749607 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.749667 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.749698 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.749722 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.749736 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:58Z","lastTransitionTime":"2026-02-21T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.853199 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.853264 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.853281 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.853309 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.853329 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:58Z","lastTransitionTime":"2026-02-21T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.955530 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.955586 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.955599 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.955615 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:58 crc kubenswrapper[4906]: I0221 00:08:58.955624 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:58Z","lastTransitionTime":"2026-02-21T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.057678 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.057734 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.057746 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.057764 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.057780 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:59Z","lastTransitionTime":"2026-02-21T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.061927 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmsd9_23efa997-378b-44cd-9f05-4a80559cd09b/ovnkube-controller/3.log" Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.062950 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmsd9_23efa997-378b-44cd-9f05-4a80559cd09b/ovnkube-controller/2.log" Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.068672 4906 generic.go:334] "Generic (PLEG): container finished" podID="23efa997-378b-44cd-9f05-4a80559cd09b" containerID="8d03248e0e19e586e9f1d4075918c0c678e58d924fb7daf7c7c30daca5f732a8" exitCode=1 Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.068765 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" event={"ID":"23efa997-378b-44cd-9f05-4a80559cd09b","Type":"ContainerDied","Data":"8d03248e0e19e586e9f1d4075918c0c678e58d924fb7daf7c7c30daca5f732a8"} Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.068816 4906 scope.go:117] "RemoveContainer" containerID="2ce8337bd884702fc27683799feff2cce24f9ad92151118070c084a04c0193b3" Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.069620 4906 scope.go:117] "RemoveContainer" containerID="8d03248e0e19e586e9f1d4075918c0c678e58d924fb7daf7c7c30daca5f732a8" Feb 21 00:08:59 crc kubenswrapper[4906]: E0221 00:08:59.069855 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bmsd9_openshift-ovn-kubernetes(23efa997-378b-44cd-9f05-4a80559cd09b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.087506 4906 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8cf02d3-2a07-464d-b75f-8d3ad8374553\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/o
penshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d527700428fdbde47902ae05e7592fb0f3b43af16fa72386eb4a1ff274156a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104130e8d59df60560691070c803ca780c06ea5439f
8d04c160d53df2042c41f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 00:07:59.169352 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 00:07:59.172022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-365547171/tls.crt::/tmp/serving-cert-365547171/tls.key\\\\\\\"\\\\nI0221 00:08:05.242429 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 00:08:05.251263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 00:08:05.251291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 00:08:05.251317 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 00:08:05.251325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 00:08:05.269495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0221 00:08:05.269788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269943 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 00:08:05.269971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 00:08:05.269996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 00:08:05.270022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' 
detected.\\\\nI0221 00:08:05.269815 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0221 00:08:05.276526 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"s
tate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:59Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.105972 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ceaf11f6f4645b302c0c6371902385cc6f8d6bdaa01cedf578765b7d72ca44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:59Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.120455 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17518505-fa81-4399-b6cd-5527dae35ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b5adc60c3a9b0129022f0c54757cf20624c76e6757c4595aa5ff3f80d69479f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c753f098aae83a1b91b668b00166c9de9e5fc03f7a39708263241e934d83fb81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9qdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:59Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.137895 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x4dw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5dbf78b-7aff-48c9-9064-b47deb9527b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ce9ee6a3dd83591b7395bbbdd306eea6587bd93b0ce540ccf67b61f66077c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6w55s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x4dw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:59Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.154565 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bf31c61-03f5-48a6-9bc2-3071d6097c5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d2240fcc84fe2ddd0f38e9418c7ad17b0a19fd0b283a2833790a7ccfeda9d7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ef80849abf92993417e85ac3dec5895f800ebf7f3d27bfc052b9bcbefde9115\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5930ffb00a4f30eb0905205eee3f562b0b57a27d60f960d1e11a60f16da8aa21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0451d4a6a1426a64e5717a5c6ebb6357d13778bc225c6d9f1aff545928bd572f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0451d4a6a1426a64e5717a5c6ebb6357d13778bc225c6d9f1aff545928bd572f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:59Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.160044 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.160076 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.160089 4906 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.160109 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.160123 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:59Z","lastTransitionTime":"2026-02-21T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.168925 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e655a3c2f3a6223fe12dded10c0c7b2e8b5024914ead29887a06efd4141b670e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265
a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c0b66a5fe8b81df7b0edfcef98acab5208abfce4cedc368707c6ce804b9b99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-21T00:08:59Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.182996 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rhw7p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7544a92e-993a-46af-9f26-243f53d1206d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46f56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46f56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rhw7p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:59Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:59 crc 
kubenswrapper[4906]: I0221 00:08:59.196903 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:59Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.212255 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfzxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8dc1d37c07d67efbc80d224af9d36cba5c59cd0d51c72f41f7f01b1e5fed80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1957f
1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1957f1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:12Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfzxw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:59Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.223153 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hbhzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24b0c67-fe8d-4e72-916a-d82306a8b82e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eec6020976a803c47d16f08a33b17b7b9654b2a9b8dd75a2c6f0a5cde4b2b963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5zp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hbhzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:59Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.239888 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23efa997-378b-44cd-9f05-4a80559cd09b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae4c734bcf6dc35c8437a49887777a17b228f7196bf2d77a8179923f121351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1574681de69105a2b0693609c82ef8c0cd554037fe2f4a2fd7345650bf4a006\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ead16930cc675e04c24dd469e0e7f7ada74a7ef3fe87e095e6702204361d89bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd0d9e0d6ac22632f0ca07c3ad926d99b71e9d1757e8ef0ee87a2008afbc671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e86b35bdd91ab61ef7469c956499be0dd990275293748f4fb8f43f2f42d84fc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8f4202ebd523d2b63c5625f20cb60477ce419a5b34d926dccab2661b2c107aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d03248e0e19e586e9f1d4075918c0c678e58d924fb7daf7c7c30daca5f732a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ce8337bd884702fc27683799feff2cce24f9ad92151118070c084a04c0193b3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T00:08:33Z\\\",\\\"message\\\":\\\"work policy controller, err: could not add Event Handler for anpInformer during admin 
network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:33Z is after 2025-08-24T17:21:41Z]\\\\nI0221 00:08:33.409954 6529 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}},\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d03248e0e19e586e9f1d4075918c0c678e58d924fb7daf7c7c30daca5f732a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T00:08:58Z\\\",\\\"message\\\":\\\"after 0 failed attempt(s)\\\\nI0221 00:08:58.407646 6903 default_network_controller.go:776] Recording success event on pod 
openshift-image-registry/node-ca-x4dw6\\\\nI0221 00:08:58.407318 6903 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-bmsd9\\\\nF0221 00:08:58.407644 6903 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:58Z is after 2025-08-24T17:21:41Z]\\\\nI0221 00:08:58.407658 6903 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-bmsd9 in node crc\\\\nI0221 00:08:58.407647 6903 model_client.go:382] Update operations generated as: [{Op:update 
Table:Logical_S\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef79e3c346342fa9d098b503a7dff636ce88cb88465cf6e802b9581980e46d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d
31e7c9688668\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmsd9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:59Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.252782 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:59Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.262738 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.262779 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.262791 4906 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.262809 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.262822 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:59Z","lastTransitionTime":"2026-02-21T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.268128 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f1d03697352fb3306d487c9b25ee0f95f98488fe929392eac603103299efd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:59Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.281779 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cqkxl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15db4e7-a13a-4bd9-8083-1ed09be64a82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b20b532af977461f36343fe3b9c58f154e726adea87f4f9c31c95a7c46dc495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f4f38fff46919386afd729ea3f6437497e71eeeb557d7f2b955d0677b822a86\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T00:08:54Z\\\",\\\"message\\\":\\\"2026-02-21T00:08:09+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d9117141-7de4-47a2-8a38-012c726038a3\\\\n2026-02-21T00:08:09+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d9117141-7de4-47a2-8a38-012c726038a3 to /host/opt/cni/bin/\\\\n2026-02-21T00:08:09Z [verbose] multus-daemon started\\\\n2026-02-21T00:08:09Z [verbose] 
Readiness Indicator file check\\\\n2026-02-21T00:08:54Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cndgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cqkxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:59Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.293651 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qqnnm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e340b2-f60e-4535-b762-294c8685122e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69f35601a717fc132b2180c4d3f2ee38faf5fd0649a002278990d44c2d7be3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjg99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46b9920d5411c5f15ee73b4448dee3a9d44d
8999d7b842c8d50afc1b5c3fc89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjg99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qqnnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:59Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.312643 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d011e7-c2df-4b35-baf1-8b0404a8ae51\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686312fec8d89c23b2ef872e891fd0c1ba1279b0a64c834893c2d8431fca05bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bc1cd9f332e30d7fde3e05085076734d742d2bf375b3b5a55ba7a7e42e0a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbe8b3f30d9bff0951ca61694164e4ece35ca5d3562ad3571150b1cd13236c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255599132ce34bbd49e4298935a18d1294b2055069c606d796d58656a5552a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c798be0178552e289398fd4312d627282ba16f1cd39b2eb7d6e2ba6d4277323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:59Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.326347 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3359de6c-dfba-4630-b39d-68e056b5d2ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7928eee765ac7aaa7118868638603d627a00b59850dac177c991754fc324122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d4b213ed7600da4237ff1c423d004e3afdaff5f599b453398266f6cdae16ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3218b20e5cf2be1651a9fbaef483c8cb2bd297449319f559727af1a47661c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:
47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f5568f3c4167bda4510df4f94df728f3286e7c8137bb0ac9f6af2c30c4992a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:59Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.339080 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:59Z is after 2025-08-24T17:21:41Z" Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.365139 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.365194 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.365207 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.365226 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.365239 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:59Z","lastTransitionTime":"2026-02-21T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.468571 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.468656 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.468717 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.468756 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.468780 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:59Z","lastTransitionTime":"2026-02-21T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.516505 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.516543 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p"
Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.516635 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.516752 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 21 00:08:59 crc kubenswrapper[4906]: E0221 00:08:59.516760 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 21 00:08:59 crc kubenswrapper[4906]: E0221 00:08:59.516908 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhw7p" podUID="7544a92e-993a-46af-9f26-243f53d1206d"
Feb 21 00:08:59 crc kubenswrapper[4906]: E0221 00:08:59.517076 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 21 00:08:59 crc kubenswrapper[4906]: E0221 00:08:59.517162 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.522506 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 07:25:06.580878391 +0000 UTC
Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.571638 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.571698 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.571714 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.571732 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.571744 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:59Z","lastTransitionTime":"2026-02-21T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.675335 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.675393 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.675406 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.675429 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.675443 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:59Z","lastTransitionTime":"2026-02-21T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.777955 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.778052 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.778069 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.778088 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.778099 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:59Z","lastTransitionTime":"2026-02-21T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.881065 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.881106 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.881116 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.881133 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.881144 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:59Z","lastTransitionTime":"2026-02-21T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.983474 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.983551 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.983571 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.983599 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 00:08:59 crc kubenswrapper[4906]: I0221 00:08:59.983617 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:08:59Z","lastTransitionTime":"2026-02-21T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.074443 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmsd9_23efa997-378b-44cd-9f05-4a80559cd09b/ovnkube-controller/3.log"
Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.077878 4906 scope.go:117] "RemoveContainer" containerID="8d03248e0e19e586e9f1d4075918c0c678e58d924fb7daf7c7c30daca5f732a8"
Feb 21 00:09:00 crc kubenswrapper[4906]: E0221 00:09:00.078197 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bmsd9_openshift-ovn-kubernetes(23efa997-378b-44cd-9f05-4a80559cd09b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" podUID="23efa997-378b-44cd-9f05-4a80559cd09b"
Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.085439 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.085484 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.085494 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.085512 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.085527 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:00Z","lastTransitionTime":"2026-02-21T00:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.111609 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d011e7-c2df-4b35-baf1-8b0404a8ae51\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686312fec8d89c23b2ef872e891fd0c1ba1279b0a64c834893c2d8431fca05bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\
\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bc1cd9f332e30d7fde3e05085076734d742d2bf375b3b5a55ba7a7e42e0a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbe8b3f30d9bff0951ca61694164e4ece35ca5d3562ad3571150b1cd13236c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255599132ce34bbd49e4298935a18d1294b2055069c606d796d58656a5552a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"
imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c798be0178552e289398fd4312d627282ba16f1cd39b2eb7d6e2ba6d4277323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"re
ason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:09:00Z is after 2025-08-24T17:21:41Z" Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.130735 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3359de6c-dfba-4630-b39d-68e056b5d2ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7928eee765ac7aaa7118868638603d627a00b59850dac177c991754fc324122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d4b213ed7600da4237ff1c423d004e3afdaff5f599b453398266f6cdae16ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3218b20e5cf2be1651a9fbaef483c8cb2bd297449319f559727af1a47661c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f5568f3c4167bda4510df4f94df728f3286e7c8137bb0ac9f6af2c30c4992a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:09:00Z is after 2025-08-24T17:21:41Z" Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.149742 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:09:00Z is after 2025-08-24T17:21:41Z" Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.168949 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:09:00Z is after 2025-08-24T17:21:41Z" Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.186086 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f1d03697352fb3306d487c9b25ee0f95f98488fe929392eac603103299efd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T00:09:00Z is after 2025-08-24T17:21:41Z" Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.188432 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.188488 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.188505 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.188527 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.188543 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:00Z","lastTransitionTime":"2026-02-21T00:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.204259 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cqkxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15db4e7-a13a-4bd9-8083-1ed09be64a82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b20b532af977461f36343fe3b9c58f154e726adea87f4f9c31c95a7c46dc495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f4f38fff46919386afd729ea3f6437497e71eeeb557d7f2b955d0677b822a86\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T00:08:54Z\\\",\\\"message\\\":\\\"2026-02-21T00:08:09+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d9117141-7de4-47a2-8a38-012c726038a3\\\\n2026-02-21T00:08:09+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d9117141-7de4-47a2-8a38-012c726038a3 to /host/opt/cni/bin/\\\\n2026-02-21T00:08:09Z [verbose] multus-daemon started\\\\n2026-02-21T00:08:09Z [verbose] Readiness Indicator file check\\\\n2026-02-21T00:08:54Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cndgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cqkxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:09:00Z is after 2025-08-24T17:21:41Z" Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.218093 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qqnnm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e340b2-f60e-4535-b762-294c8685122e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69f35601a717fc132b2180c4d3f2ee38faf5fd0649a002278990d44c2d7be3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjg99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46b9920d5411c5f15ee73b4448dee3a9d44d
8999d7b842c8d50afc1b5c3fc89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjg99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qqnnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:09:00Z is after 2025-08-24T17:21:41Z" Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.234368 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8cf02d3-2a07-464d-b75f-8d3ad8374553\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d527700428fdbde47902ae05e7592fb0f3b43af16fa72386eb4a1ff274156a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104130e8d59df60560691070c803ca780c06ea5439f8d04c160d53df2042c41f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T00:08:05Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 00:07:59.169352 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 00:07:59.172022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-365547171/tls.crt::/tmp/serving-cert-365547171/tls.key\\\\\\\"\\\\nI0221 00:08:05.242429 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 00:08:05.251263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 00:08:05.251291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 00:08:05.251317 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 00:08:05.251325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 00:08:05.269495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0221 00:08:05.269788 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269943 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 00:08:05.269971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 00:08:05.269996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 00:08:05.270022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0221 00:08:05.269815 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0221 00:08:05.276526 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f2
2cee82852bec2d1f9d010c485e5d1a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:09:00Z is after 2025-08-24T17:21:41Z" Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.251963 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ceaf11f6f4645b302c0c6371902385cc6f8d6bdaa01cedf578765b7d72ca44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-21T00:09:00Z is after 2025-08-24T17:21:41Z" Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.270455 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17518505-fa81-4399-b6cd-5527dae35ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b5adc60c3a9b0129022f0c54757cf20624c76e6757c4595aa5ff3f80d69479f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c753f098aae83a1b91b668b00166c9de9e5fc03f7a39708263241e934d83fb81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9qdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:09:00Z is after 2025-08-24T17:21:41Z" Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.284909 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x4dw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5dbf78b-7aff-48c9-9064-b47deb9527b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ce9ee6a3dd83591b7395bbbdd306eea6587bd93b0ce540ccf67b61f66077c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6w55s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x4dw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:09:00Z is after 2025-08-24T17:21:41Z" Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.290559 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.290588 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.290595 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.290610 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.290619 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:00Z","lastTransitionTime":"2026-02-21T00:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.300450 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bf31c61-03f5-48a6-9bc2-3071d6097c5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d2240fcc84fe2ddd0f38e9418c7ad17b0a19fd0b283a2833790a7ccfeda9d7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ef80849abf92993417e85ac3dec58
95f800ebf7f3d27bfc052b9bcbefde9115\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5930ffb00a4f30eb0905205eee3f562b0b57a27d60f960d1e11a60f16da8aa21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0451d4a6a1426a64e5717a5c6ebb6357d13778bc225c6d9f1aff545928bd572f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0451d4a6a1426a64e5717a5c6ebb6357d13778bc225c6d9f1aff545928bd572f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:09:00Z is after 2025-08-24T17:21:41Z" Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.315311 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e655a3c2f3a6223fe12dded10c0c7b2e8b5024914ead29887a06efd4141b670e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c0b66a5fe8b81df7b0edfcef98acab5208abfce4cedc368707c6ce804b9b99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:09:00Z is after 2025-08-24T17:21:41Z" Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.332155 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:09:00Z is after 2025-08-24T17:21:41Z" Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.347555 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfzxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8dc1d37c07d67efbc80d224af9d36cba5c59cd0d51c72f41f7f01b1e5fed80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1957f
1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1957f1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:12Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfzxw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:09:00Z is after 2025-08-24T17:21:41Z" Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.361295 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hbhzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24b0c67-fe8d-4e72-916a-d82306a8b82e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eec6020976a803c47d16f08a33b17b7b9654b2a9b8dd75a2c6f0a5cde4b2b963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5zp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hbhzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:09:00Z is after 2025-08-24T17:21:41Z" Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.383039 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23efa997-378b-44cd-9f05-4a80559cd09b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae4c734bcf6dc35c8437a49887777a17b228f7196bf2d77a8179923f121351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1574681de69105a2b0693609c82ef8c0cd554037fe2f4a2fd7345650bf4a006\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ead16930cc675e04c24dd469e0e7f7ada74a7ef3fe87e095e6702204361d89bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd0d9e0d6ac22632f0ca07c3ad926d99b71e9d1757e8ef0ee87a2008afbc671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e86b35bdd91ab61ef7469c956499be0dd990275293748f4fb8f43f2f42d84fc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8f4202ebd523d2b63c5625f20cb60477ce419a5b34d926dccab2661b2c107aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d03248e0e19e586e9f1d4075918c0c678e58d924fb7daf7c7c30daca5f732a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d03248e0e19e586e9f1d4075918c0c678e58d924fb7daf7c7c30daca5f732a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T00:08:58Z\\\",\\\"message\\\":\\\"after 0 failed attempt(s)\\\\nI0221 00:08:58.407646 6903 
default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-x4dw6\\\\nI0221 00:08:58.407318 6903 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-bmsd9\\\\nF0221 00:08:58.407644 6903 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:58Z is after 2025-08-24T17:21:41Z]\\\\nI0221 00:08:58.407658 6903 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-bmsd9 in node crc\\\\nI0221 00:08:58.407647 6903 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_S\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bmsd9_openshift-ovn-kubernetes(23efa997-378b-44cd-9f05-4a80559cd09b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef79e3c346342fa9d098b503a7dff636ce88cb88465cf6e802b9581980e46d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442b
ced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmsd9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:09:00Z is after 2025-08-24T17:21:41Z" Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.393068 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.393103 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.393114 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.393131 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.393145 4906 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:00Z","lastTransitionTime":"2026-02-21T00:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.398038 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rhw7p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7544a92e-993a-46af-9f26-243f53d1206d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46f56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46f56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rhw7p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:09:00Z is after 2025-08-24T17:21:41Z" Feb 21 00:09:00 crc 
kubenswrapper[4906]: I0221 00:09:00.495266 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.495305 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.495315 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.495330 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.495341 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:00Z","lastTransitionTime":"2026-02-21T00:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.522873 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 02:11:42.745086555 +0000 UTC Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.598085 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.598124 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.598133 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.598149 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.598158 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:00Z","lastTransitionTime":"2026-02-21T00:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.700910 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.700971 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.700992 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.701022 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.701043 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:00Z","lastTransitionTime":"2026-02-21T00:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.803347 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.803401 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.803414 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.803433 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.803473 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:00Z","lastTransitionTime":"2026-02-21T00:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.905851 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.905914 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.905933 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.905962 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:00 crc kubenswrapper[4906]: I0221 00:09:00.905980 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:00Z","lastTransitionTime":"2026-02-21T00:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.008435 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.008507 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.008553 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.008584 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.008603 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:01Z","lastTransitionTime":"2026-02-21T00:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.111387 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.111451 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.111467 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.111494 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.111511 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:01Z","lastTransitionTime":"2026-02-21T00:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.214875 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.214953 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.214973 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.215000 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.215017 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:01Z","lastTransitionTime":"2026-02-21T00:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.318467 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.318524 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.318540 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.318578 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.318597 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:01Z","lastTransitionTime":"2026-02-21T00:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.422124 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.422189 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.422206 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.422230 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.422247 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:01Z","lastTransitionTime":"2026-02-21T00:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.517010 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.517070 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.517113 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.517070 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:09:01 crc kubenswrapper[4906]: E0221 00:09:01.517223 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:09:01 crc kubenswrapper[4906]: E0221 00:09:01.517381 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:09:01 crc kubenswrapper[4906]: E0221 00:09:01.517481 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rhw7p" podUID="7544a92e-993a-46af-9f26-243f53d1206d" Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.523148 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 09:17:00.903403683 +0000 UTC Feb 21 00:09:01 crc kubenswrapper[4906]: E0221 00:09:01.524296 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.527311 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.527372 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.527389 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.527414 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.527431 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:01Z","lastTransitionTime":"2026-02-21T00:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.629958 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.629998 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.630006 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.630023 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.630036 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:01Z","lastTransitionTime":"2026-02-21T00:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.732796 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.732859 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.732877 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.732911 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.732927 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:01Z","lastTransitionTime":"2026-02-21T00:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.836069 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.836156 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.836179 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.836213 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.836237 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:01Z","lastTransitionTime":"2026-02-21T00:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.940051 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.940127 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.940144 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.940174 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:01 crc kubenswrapper[4906]: I0221 00:09:01.940192 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:01Z","lastTransitionTime":"2026-02-21T00:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.042752 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.042802 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.042824 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.042849 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.042900 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:02Z","lastTransitionTime":"2026-02-21T00:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.146332 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.146379 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.146387 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.146405 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.146416 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:02Z","lastTransitionTime":"2026-02-21T00:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.249191 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.249252 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.249269 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.249292 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.249303 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:02Z","lastTransitionTime":"2026-02-21T00:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.352446 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.352518 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.352541 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.352575 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.352600 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:02Z","lastTransitionTime":"2026-02-21T00:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.455904 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.456028 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.456046 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.456065 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.456081 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:02Z","lastTransitionTime":"2026-02-21T00:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.524639 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 08:41:31.763203633 +0000 UTC Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.559430 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.559496 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.559513 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.559539 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.559561 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:02Z","lastTransitionTime":"2026-02-21T00:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.663462 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.663521 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.663535 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.663555 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.663569 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:02Z","lastTransitionTime":"2026-02-21T00:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.766499 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.766576 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.766589 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.766610 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.766626 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:02Z","lastTransitionTime":"2026-02-21T00:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.817122 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.817192 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.817208 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.817234 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.817253 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:02Z","lastTransitionTime":"2026-02-21T00:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:02 crc kubenswrapper[4906]: E0221 00:09:02.834726 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:09:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:09:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:09:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:09:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:09:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:09:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:09:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:09:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4d8b0bdc-2182-48d0-bb15-cc57765305f9\\\",\\\"systemUUID\\\":\\\"94310220-1d46-4502-bb0a-b3628ff11479\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:09:02Z is after 2025-08-24T17:21:41Z" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.839594 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.839641 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.839660 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.839717 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.839736 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:02Z","lastTransitionTime":"2026-02-21T00:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:02 crc kubenswrapper[4906]: E0221 00:09:02.859654 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:09:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:09:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:09:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:09:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:09:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:09:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:09:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:09:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4d8b0bdc-2182-48d0-bb15-cc57765305f9\\\",\\\"systemUUID\\\":\\\"94310220-1d46-4502-bb0a-b3628ff11479\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:09:02Z is after 2025-08-24T17:21:41Z" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.865836 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.865895 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.865909 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.865930 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.865944 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:02Z","lastTransitionTime":"2026-02-21T00:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:02 crc kubenswrapper[4906]: E0221 00:09:02.885496 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:09:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:09:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:09:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:09:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:09:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:09:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:09:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:09:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4d8b0bdc-2182-48d0-bb15-cc57765305f9\\\",\\\"systemUUID\\\":\\\"94310220-1d46-4502-bb0a-b3628ff11479\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:09:02Z is after 2025-08-24T17:21:41Z" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.890487 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.890542 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.890555 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.890575 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.890590 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:02Z","lastTransitionTime":"2026-02-21T00:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:02 crc kubenswrapper[4906]: E0221 00:09:02.906832 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:09:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:09:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:09:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:09:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:09:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:09:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:09:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:09:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4d8b0bdc-2182-48d0-bb15-cc57765305f9\\\",\\\"systemUUID\\\":\\\"94310220-1d46-4502-bb0a-b3628ff11479\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:09:02Z is after 2025-08-24T17:21:41Z" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.912094 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.912169 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.912188 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.912216 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.912237 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:02Z","lastTransitionTime":"2026-02-21T00:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:02 crc kubenswrapper[4906]: E0221 00:09:02.929482 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:09:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:09:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:09:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:09:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:09:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:09:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T00:09:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T00:09:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4d8b0bdc-2182-48d0-bb15-cc57765305f9\\\",\\\"systemUUID\\\":\\\"94310220-1d46-4502-bb0a-b3628ff11479\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:09:02Z is after 2025-08-24T17:21:41Z" Feb 21 00:09:02 crc kubenswrapper[4906]: E0221 00:09:02.929764 4906 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.931807 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.931857 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.931874 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.931896 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:02 crc kubenswrapper[4906]: I0221 00:09:02.931911 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:02Z","lastTransitionTime":"2026-02-21T00:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.035316 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.035377 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.035389 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.035410 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.035426 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:03Z","lastTransitionTime":"2026-02-21T00:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.138787 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.138859 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.138876 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.138907 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.138926 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:03Z","lastTransitionTime":"2026-02-21T00:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.243488 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.243548 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.243565 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.243592 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.243609 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:03Z","lastTransitionTime":"2026-02-21T00:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.348047 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.348114 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.348126 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.348148 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.348166 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:03Z","lastTransitionTime":"2026-02-21T00:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.451616 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.451736 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.451758 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.451789 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.451808 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:03Z","lastTransitionTime":"2026-02-21T00:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.517028 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.517101 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.517102 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:09:03 crc kubenswrapper[4906]: E0221 00:09:03.517206 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.517296 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:09:03 crc kubenswrapper[4906]: E0221 00:09:03.517422 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhw7p" podUID="7544a92e-993a-46af-9f26-243f53d1206d" Feb 21 00:09:03 crc kubenswrapper[4906]: E0221 00:09:03.517481 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:09:03 crc kubenswrapper[4906]: E0221 00:09:03.517562 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.525131 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 20:09:12.706797935 +0000 UTC Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.554763 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.554824 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.554842 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.554868 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.554890 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:03Z","lastTransitionTime":"2026-02-21T00:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.657772 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.657846 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.657869 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.657899 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.657922 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:03Z","lastTransitionTime":"2026-02-21T00:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.762237 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.762313 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.762421 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.762459 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.762483 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:03Z","lastTransitionTime":"2026-02-21T00:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.864817 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.864868 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.864879 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.864897 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.864910 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:03Z","lastTransitionTime":"2026-02-21T00:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.968197 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.968253 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.968267 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.968287 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:03 crc kubenswrapper[4906]: I0221 00:09:03.968300 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:03Z","lastTransitionTime":"2026-02-21T00:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.071136 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.071183 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.071197 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.071216 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.071230 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:04Z","lastTransitionTime":"2026-02-21T00:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.174303 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.174382 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.174465 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.174496 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.174522 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:04Z","lastTransitionTime":"2026-02-21T00:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.277620 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.277679 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.277735 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.277756 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.277769 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:04Z","lastTransitionTime":"2026-02-21T00:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.380405 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.380490 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.380512 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.380542 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.380564 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:04Z","lastTransitionTime":"2026-02-21T00:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.483148 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.483206 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.483219 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.483240 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.483254 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:04Z","lastTransitionTime":"2026-02-21T00:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.526311 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 15:34:10.100380795 +0000 UTC Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.586733 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.586816 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.586843 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.586876 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.586902 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:04Z","lastTransitionTime":"2026-02-21T00:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.690136 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.690205 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.690226 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.690257 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.690280 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:04Z","lastTransitionTime":"2026-02-21T00:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.793210 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.793285 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.793308 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.793339 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.793366 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:04Z","lastTransitionTime":"2026-02-21T00:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.896266 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.896312 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.896330 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.896359 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.896375 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:04Z","lastTransitionTime":"2026-02-21T00:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.999249 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.999302 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.999320 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.999345 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:04 crc kubenswrapper[4906]: I0221 00:09:04.999362 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:04Z","lastTransitionTime":"2026-02-21T00:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.102269 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.102322 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.102345 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.102371 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.102389 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:05Z","lastTransitionTime":"2026-02-21T00:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.206230 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.206309 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.206329 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.206357 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.206377 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:05Z","lastTransitionTime":"2026-02-21T00:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.309237 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.309303 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.309349 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.309374 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.309397 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:05Z","lastTransitionTime":"2026-02-21T00:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.412549 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.412597 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.412609 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.412628 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.412641 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:05Z","lastTransitionTime":"2026-02-21T00:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.515678 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.515780 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.515795 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.515814 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.515825 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:05Z","lastTransitionTime":"2026-02-21T00:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.516134 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:09:05 crc kubenswrapper[4906]: E0221 00:09:05.516272 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.516337 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.516378 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.516416 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:09:05 crc kubenswrapper[4906]: E0221 00:09:05.516577 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:09:05 crc kubenswrapper[4906]: E0221 00:09:05.516764 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:09:05 crc kubenswrapper[4906]: E0221 00:09:05.516964 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhw7p" podUID="7544a92e-993a-46af-9f26-243f53d1206d" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.526583 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 08:12:38.114774079 +0000 UTC Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.561679 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d011e7-c2df-4b35-baf1-8b0404a8ae51\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686312fec8d89c23b2ef872e891fd0c1ba1279b0a64c834893c2d8431fca05bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bc1cd9f332e30d7fde3e05085076734d742d2bf375b3b5a55ba7a7e42e0a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbe8b3f30d9bff0951ca61694164e4ece35ca5d3562ad3571150b1cd13236c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://255599132ce34bbd49e4298935a18d1294b2055069c606d796d58656a5552a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c798be0178552e289398fd4312d627282ba16f1cd39b2eb7d6e2ba6d4277323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61592e51b1d5e0dfa8a03ec7efde22defa785ad789600ffc79261f8aa74ec635\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb9db6b64db484ef07aa42fec89794a487d9361954b5c743785df23e675f1410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9214fc1ccde32ccf2d63180cd49a2acd72b8e2b5e9f5fe8494533ae7bca039b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:09:05Z is after 2025-08-24T17:21:41Z" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.581020 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3359de6c-dfba-4630-b39d-68e056b5d2ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7928eee765ac7aaa7118868638603d627a00b59850dac177c991754fc324122c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57d4b213ed7600da4237ff1c423d004e3afdaff5f599b453398266f6cdae16ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3218b20e5cf2be1651a9fbaef483c8cb2bd297449319f559727af1a47661c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:
47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f5568f3c4167bda4510df4f94df728f3286e7c8137bb0ac9f6af2c30c4992a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:09:05Z is after 2025-08-24T17:21:41Z" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.595848 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:09:05Z is after 2025-08-24T17:21:41Z" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.617836 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:09:05Z is after 2025-08-24T17:21:41Z" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.618982 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.619055 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.619073 4906 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.619099 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.619118 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:05Z","lastTransitionTime":"2026-02-21T00:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.632791 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f1d03697352fb3306d487c9b25ee0f95f98488fe929392eac603103299efd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:09:05Z is after 2025-08-24T17:21:41Z" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.648466 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cqkxl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15db4e7-a13a-4bd9-8083-1ed09be64a82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b20b532af977461f36343fe3b9c58f154e726adea87f4f9c31c95a7c46dc495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f4f38fff46919386afd729ea3f6437497e71eeeb557d7f2b955d0677b822a86\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T00:08:54Z\\\",\\\"message\\\":\\\"2026-02-21T00:08:09+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d9117141-7de4-47a2-8a38-012c726038a3\\\\n2026-02-21T00:08:09+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d9117141-7de4-47a2-8a38-012c726038a3 to /host/opt/cni/bin/\\\\n2026-02-21T00:08:09Z [verbose] multus-daemon started\\\\n2026-02-21T00:08:09Z [verbose] 
Readiness Indicator file check\\\\n2026-02-21T00:08:54Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cndgv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cqkxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:09:05Z is after 2025-08-24T17:21:41Z" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.663233 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qqnnm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e340b2-f60e-4535-b762-294c8685122e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69f35601a717fc132b2180c4d3f2ee38faf5fd0649a002278990d44c2d7be3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjg99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46b9920d5411c5f15ee73b4448dee3a9d44d
8999d7b842c8d50afc1b5c3fc89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjg99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qqnnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:09:05Z is after 2025-08-24T17:21:41Z" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.679539 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ceaf11f6f4645b302c0c6371902385cc6f8d6bdaa01cedf578765b7d72ca44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-21T00:09:05Z is after 2025-08-24T17:21:41Z" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.690888 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17518505-fa81-4399-b6cd-5527dae35ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b5adc60c3a9b0129022f0c54757cf20624c76e6757c4595aa5ff3f80d69479f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c753f098aae83a1b91b668b00166c9de9e5fc03f7a39708263241e934d83fb81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9qdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:09:05Z is after 2025-08-24T17:21:41Z" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.701790 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x4dw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5dbf78b-7aff-48c9-9064-b47deb9527b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0ce9ee6a3dd83591b7395bbbdd306eea6587bd93b0ce540ccf67b61f66077c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6w55s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x4dw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:09:05Z is after 2025-08-24T17:21:41Z" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.717522 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8cf02d3-2a07-464d-b75f-8d3ad8374553\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d527700428fdbde47902ae05e7592fb0f3b43af16fa72386eb4a1ff274156a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://104130e8d59df60560691070c803ca780c06ea5439f8d04c160d53df2042c41f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 00:07:59.169352 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 00:07:59.172022 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-365547171/tls.crt::/tmp/serving-cert-365547171/tls.key\\\\\\\"\\\\nI0221 00:08:05.242429 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 00:08:05.251263 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 00:08:05.251291 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 00:08:05.251317 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 00:08:05.251325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 00:08:05.269495 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0221 00:08:05.269788 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269909 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 00:08:05.269943 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 00:08:05.269971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 00:08:05.269996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 00:08:05.270022 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0221 00:08:05.269815 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0221 00:08:05.276526 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T0
0:07:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:09:05Z is after 2025-08-24T17:21:41Z" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.721748 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.721826 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.721854 4906 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.721890 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.721934 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:05Z","lastTransitionTime":"2026-02-21T00:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.733407 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e655a3c2f3a6223fe12dded10c0c7b2e8b5024914ead29887a06efd4141b670e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c0b66a5fe8b81df7b0edfcef98acab5208abfce4cedc368707c6ce804b9b99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-02-21T00:09:05Z is after 2025-08-24T17:21:41Z" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.745274 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bf31c61-03f5-48a6-9bc2-3071d6097c5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d2240fcc84fe2ddd0f38e9418c7ad17b0a19fd0b283a2833790a7ccfeda9d7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e
f80849abf92993417e85ac3dec5895f800ebf7f3d27bfc052b9bcbefde9115\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5930ffb00a4f30eb0905205eee3f562b0b57a27d60f960d1e11a60f16da8aa21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0451d4a6a1426a64e5717a5c6ebb6357d13778bc225c6d9f1aff545928bd572f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0451d4a6a1426a64e5717a5c6ebb6357d13778bc225c6d9f1aff545928bd572f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:07:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:07:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:07:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:09:05Z is after 2025-08-24T17:21:41Z" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.764671 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tfzxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c12f-3db0-4bb9-abb2-e78756ad93a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8dc1d37c07d67efbc80d224af9d36cba5c59cd0d51c72f41f7f01b1e5fed80a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://333eea4fc0b7dff7d963140c7170d2ee90d197bea040509e505721eb8bee2290\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2608953f7b004971889019911261cce184d87b1bfecde300233b01d287a3b578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://902f597b4e14e58e048fa007bc8e0b5ace9d8ca311c8bd6cd067262534f042df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1957f
1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1957f1382ad1777d5a1d00194a7628227309b46384cb8c651defa727b43ae5d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4fb1db5b1e384ec2e2ea838ab2c60b5820d7f87f7f620f93aafe6923e964be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:12Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16e90521a66d38588658121de864ab2b4e6655d7b35fc96053ea474a0c0294f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plbk2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tfzxw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:09:05Z is after 2025-08-24T17:21:41Z" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.776506 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hbhzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24b0c67-fe8d-4e72-916a-d82306a8b82e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eec6020976a803c47d16f08a33b17b7b9654b2a9b8dd75a2c6f0a5cde4b2b963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5zp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hbhzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:09:05Z is after 2025-08-24T17:21:41Z" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.797026 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23efa997-378b-44cd-9f05-4a80559cd09b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae4c734bcf6dc35c8437a49887777a17b228f7196bf2d77a8179923f121351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1574681de69105a2b0693609c82ef8c0cd554037fe2f4a2fd7345650bf4a006\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ead16930cc675e04c24dd469e0e7f7ada74a7ef3fe87e095e6702204361d89bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cd0d9e0d6ac22632f0ca07c3ad926d99b71e9d1757e8ef0ee87a2008afbc671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e86b35bdd91ab61ef7469c956499be0dd990275293748f4fb8f43f2f42d84fc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8f4202ebd523d2b63c5625f20cb60477ce419a5b34d926dccab2661b2c107aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d03248e0e19e586e9f1d4075918c0c678e58d924fb7daf7c7c30daca5f732a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d03248e0e19e586e9f1d4075918c0c678e58d924fb7daf7c7c30daca5f732a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T00:08:58Z\\\",\\\"message\\\":\\\"after 0 failed attempt(s)\\\\nI0221 00:08:58.407646 6903 
default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-x4dw6\\\\nI0221 00:08:58.407318 6903 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-bmsd9\\\\nF0221 00:08:58.407644 6903 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:08:58Z is after 2025-08-24T17:21:41Z]\\\\nI0221 00:08:58.407658 6903 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-bmsd9 in node crc\\\\nI0221 00:08:58.407647 6903 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_S\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bmsd9_openshift-ovn-kubernetes(23efa997-378b-44cd-9f05-4a80559cd09b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef79e3c346342fa9d098b503a7dff636ce88cb88465cf6e802b9581980e46d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://352a09dbb2d285442b
ced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T00:08:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T00:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sbct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bmsd9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:09:05Z is after 2025-08-24T17:21:41Z" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.810575 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rhw7p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7544a92e-993a-46af-9f26-243f53d1206d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46f56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46f56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T00:08:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rhw7p\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:09:05Z is after 2025-08-24T17:21:41Z" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.824729 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.824773 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.824784 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.824801 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.824812 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:05Z","lastTransitionTime":"2026-02-21T00:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.827255 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T00:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T00:09:05Z is after 2025-08-24T17:21:41Z" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.927877 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.927954 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.927972 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.928001 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:05 crc kubenswrapper[4906]: I0221 00:09:05.928020 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:05Z","lastTransitionTime":"2026-02-21T00:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.031091 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.031150 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.031162 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.031183 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.031195 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:06Z","lastTransitionTime":"2026-02-21T00:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.134174 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.134259 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.134282 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.134319 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.134342 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:06Z","lastTransitionTime":"2026-02-21T00:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.238066 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.238136 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.238158 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.238190 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.238215 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:06Z","lastTransitionTime":"2026-02-21T00:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.341556 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.341624 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.341651 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.341714 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.341740 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:06Z","lastTransitionTime":"2026-02-21T00:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.445376 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.445432 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.445444 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.445465 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.445479 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:06Z","lastTransitionTime":"2026-02-21T00:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.528042 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 05:22:00.401481668 +0000 UTC Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.548135 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.548177 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.548186 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.548203 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.548215 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:06Z","lastTransitionTime":"2026-02-21T00:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.651064 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.651105 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.651113 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.651128 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.651137 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:06Z","lastTransitionTime":"2026-02-21T00:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.754568 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.754629 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.754655 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.754720 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.754746 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:06Z","lastTransitionTime":"2026-02-21T00:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.857045 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.857117 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.857144 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.857177 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.857199 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:06Z","lastTransitionTime":"2026-02-21T00:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.961680 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.961780 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.961804 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.961835 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:06 crc kubenswrapper[4906]: I0221 00:09:06.961853 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:06Z","lastTransitionTime":"2026-02-21T00:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.065334 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.065406 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.065418 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.065438 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.065450 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:07Z","lastTransitionTime":"2026-02-21T00:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.169129 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.169179 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.169188 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.169205 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.169215 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:07Z","lastTransitionTime":"2026-02-21T00:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.272734 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.272794 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.272809 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.272831 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.272846 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:07Z","lastTransitionTime":"2026-02-21T00:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.376082 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.376145 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.376160 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.376184 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.376202 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:07Z","lastTransitionTime":"2026-02-21T00:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.479211 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.479272 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.479282 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.479304 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.479317 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:07Z","lastTransitionTime":"2026-02-21T00:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.516882 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.516981 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.517079 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:09:07 crc kubenswrapper[4906]: E0221 00:09:07.517083 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:09:07 crc kubenswrapper[4906]: E0221 00:09:07.517218 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:09:07 crc kubenswrapper[4906]: E0221 00:09:07.517315 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.517319 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:09:07 crc kubenswrapper[4906]: E0221 00:09:07.517437 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhw7p" podUID="7544a92e-993a-46af-9f26-243f53d1206d" Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.528240 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 20:46:43.027845867 +0000 UTC Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.582532 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.582585 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.582600 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.582620 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.582633 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:07Z","lastTransitionTime":"2026-02-21T00:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.686006 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.686080 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.686095 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.686117 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.686135 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:07Z","lastTransitionTime":"2026-02-21T00:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.789885 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.789924 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.789935 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.789951 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.789960 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:07Z","lastTransitionTime":"2026-02-21T00:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.893910 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.893984 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.894001 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.894028 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.894051 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:07Z","lastTransitionTime":"2026-02-21T00:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.996717 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.996774 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.996787 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.996807 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:07 crc kubenswrapper[4906]: I0221 00:09:07.996820 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:07Z","lastTransitionTime":"2026-02-21T00:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:08 crc kubenswrapper[4906]: I0221 00:09:08.099541 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:08 crc kubenswrapper[4906]: I0221 00:09:08.099611 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:08 crc kubenswrapper[4906]: I0221 00:09:08.099624 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:08 crc kubenswrapper[4906]: I0221 00:09:08.099648 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:08 crc kubenswrapper[4906]: I0221 00:09:08.099665 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:08Z","lastTransitionTime":"2026-02-21T00:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:08 crc kubenswrapper[4906]: I0221 00:09:08.203336 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:08 crc kubenswrapper[4906]: I0221 00:09:08.203373 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:08 crc kubenswrapper[4906]: I0221 00:09:08.203382 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:08 crc kubenswrapper[4906]: I0221 00:09:08.203397 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:08 crc kubenswrapper[4906]: I0221 00:09:08.203408 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:08Z","lastTransitionTime":"2026-02-21T00:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:08 crc kubenswrapper[4906]: I0221 00:09:08.305757 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:08 crc kubenswrapper[4906]: I0221 00:09:08.305836 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:08 crc kubenswrapper[4906]: I0221 00:09:08.305850 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:08 crc kubenswrapper[4906]: I0221 00:09:08.305870 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:08 crc kubenswrapper[4906]: I0221 00:09:08.305881 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:08Z","lastTransitionTime":"2026-02-21T00:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:08 crc kubenswrapper[4906]: I0221 00:09:08.409336 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:08 crc kubenswrapper[4906]: I0221 00:09:08.409408 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:08 crc kubenswrapper[4906]: I0221 00:09:08.409434 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:08 crc kubenswrapper[4906]: I0221 00:09:08.409468 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:08 crc kubenswrapper[4906]: I0221 00:09:08.409491 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:08Z","lastTransitionTime":"2026-02-21T00:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:08 crc kubenswrapper[4906]: I0221 00:09:08.511993 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:08 crc kubenswrapper[4906]: I0221 00:09:08.512056 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:08 crc kubenswrapper[4906]: I0221 00:09:08.512081 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:08 crc kubenswrapper[4906]: I0221 00:09:08.512104 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:08 crc kubenswrapper[4906]: I0221 00:09:08.512119 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:08Z","lastTransitionTime":"2026-02-21T00:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:08 crc kubenswrapper[4906]: I0221 00:09:08.528390 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 22:34:23.317560323 +0000 UTC Feb 21 00:09:08 crc kubenswrapper[4906]: I0221 00:09:08.615159 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:08 crc kubenswrapper[4906]: I0221 00:09:08.615225 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:08 crc kubenswrapper[4906]: I0221 00:09:08.615240 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:08 crc kubenswrapper[4906]: I0221 00:09:08.615267 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:08 crc kubenswrapper[4906]: I0221 00:09:08.615284 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:08Z","lastTransitionTime":"2026-02-21T00:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:08 crc kubenswrapper[4906]: I0221 00:09:08.718025 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:08 crc kubenswrapper[4906]: I0221 00:09:08.718097 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:08 crc kubenswrapper[4906]: I0221 00:09:08.718116 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:08 crc kubenswrapper[4906]: I0221 00:09:08.718142 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:08 crc kubenswrapper[4906]: I0221 00:09:08.718159 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:08Z","lastTransitionTime":"2026-02-21T00:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:08 crc kubenswrapper[4906]: I0221 00:09:08.821957 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:08 crc kubenswrapper[4906]: I0221 00:09:08.822042 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:08 crc kubenswrapper[4906]: I0221 00:09:08.822067 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:08 crc kubenswrapper[4906]: I0221 00:09:08.822105 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:08 crc kubenswrapper[4906]: I0221 00:09:08.822131 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:08Z","lastTransitionTime":"2026-02-21T00:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:08 crc kubenswrapper[4906]: I0221 00:09:08.924553 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:08 crc kubenswrapper[4906]: I0221 00:09:08.924606 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:08 crc kubenswrapper[4906]: I0221 00:09:08.924616 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:08 crc kubenswrapper[4906]: I0221 00:09:08.924636 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:08 crc kubenswrapper[4906]: I0221 00:09:08.924648 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:08Z","lastTransitionTime":"2026-02-21T00:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.026956 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.027013 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.027026 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.027041 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.027054 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:09Z","lastTransitionTime":"2026-02-21T00:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.129469 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.129817 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.129840 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.129872 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.129893 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:09Z","lastTransitionTime":"2026-02-21T00:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.232839 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.232900 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.232912 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.232932 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.232950 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:09Z","lastTransitionTime":"2026-02-21T00:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.336030 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.336110 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.336132 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.336159 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.336177 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:09Z","lastTransitionTime":"2026-02-21T00:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.439150 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.439213 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.439229 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.439248 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.439260 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:09Z","lastTransitionTime":"2026-02-21T00:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.516959 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.517104 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:09:09 crc kubenswrapper[4906]: E0221 00:09:09.517244 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:13.517199094 +0000 UTC m=+148.768786610 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:09:09 crc kubenswrapper[4906]: E0221 00:09:09.517277 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.517340 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:09:09 crc kubenswrapper[4906]: E0221 00:09:09.517476 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.517628 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.517841 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:09:09 crc kubenswrapper[4906]: E0221 00:09:09.518011 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhw7p" podUID="7544a92e-993a-46af-9f26-243f53d1206d" Feb 21 00:09:09 crc kubenswrapper[4906]: E0221 00:09:09.518124 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.529045 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 21:47:17.163946524 +0000 UTC Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.542407 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.542759 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.542857 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.542969 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.543058 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:09Z","lastTransitionTime":"2026-02-21T00:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.618518 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:09:09 crc kubenswrapper[4906]: E0221 00:09:09.618746 4906 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.618966 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:09:09 crc kubenswrapper[4906]: E0221 00:09:09.619035 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-21 00:10:13.619004893 +0000 UTC m=+148.870592439 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.619069 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:09:09 crc kubenswrapper[4906]: E0221 00:09:09.619122 4906 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 21 00:09:09 crc kubenswrapper[4906]: E0221 00:09:09.619195 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 21 00:09:09 crc kubenswrapper[4906]: E0221 00:09:09.619208 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 21 00:09:09 crc kubenswrapper[4906]: E0221 00:09:09.619215 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-21 00:10:13.619190038 +0000 UTC m=+148.870777604 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 21 00:09:09 crc kubenswrapper[4906]: E0221 00:09:09.619221 4906 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.619125 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:09:09 crc kubenswrapper[4906]: E0221 00:09:09.619249 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-21 00:10:13.61924189 +0000 UTC m=+148.870829496 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 00:09:09 crc kubenswrapper[4906]: E0221 00:09:09.619319 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 21 00:09:09 crc kubenswrapper[4906]: E0221 00:09:09.619348 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 21 00:09:09 crc kubenswrapper[4906]: E0221 00:09:09.619368 4906 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 00:09:09 crc kubenswrapper[4906]: E0221 00:09:09.619503 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-21 00:10:13.619478906 +0000 UTC m=+148.871066472 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.646440 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.646524 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.646539 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.646561 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.646576 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:09Z","lastTransitionTime":"2026-02-21T00:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.749795 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.749842 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.749852 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.749871 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.749885 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:09Z","lastTransitionTime":"2026-02-21T00:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.853138 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.853202 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.853219 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.853244 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.853265 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:09Z","lastTransitionTime":"2026-02-21T00:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.956489 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.956550 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.956800 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.956822 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:09 crc kubenswrapper[4906]: I0221 00:09:09.956836 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:09Z","lastTransitionTime":"2026-02-21T00:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.059975 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.060047 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.060060 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.060105 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.060122 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:10Z","lastTransitionTime":"2026-02-21T00:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.162754 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.162818 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.162847 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.162867 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.162878 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:10Z","lastTransitionTime":"2026-02-21T00:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.265893 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.265947 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.265961 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.265985 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.265996 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:10Z","lastTransitionTime":"2026-02-21T00:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.369201 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.369250 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.369262 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.369286 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.369300 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:10Z","lastTransitionTime":"2026-02-21T00:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.472289 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.472346 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.472356 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.472376 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.472388 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:10Z","lastTransitionTime":"2026-02-21T00:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.530166 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 01:20:48.246428985 +0000 UTC Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.575085 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.575131 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.575142 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.575160 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.575173 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:10Z","lastTransitionTime":"2026-02-21T00:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.677780 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.677853 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.677870 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.677897 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.677915 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:10Z","lastTransitionTime":"2026-02-21T00:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.781208 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.781276 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.781290 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.781313 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.781328 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:10Z","lastTransitionTime":"2026-02-21T00:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.884673 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.884748 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.884763 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.884782 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.884797 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:10Z","lastTransitionTime":"2026-02-21T00:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.987620 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.987759 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.987793 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.987827 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:10 crc kubenswrapper[4906]: I0221 00:09:10.987851 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:10Z","lastTransitionTime":"2026-02-21T00:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.090723 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.090798 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.090817 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.090835 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.090852 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:11Z","lastTransitionTime":"2026-02-21T00:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.194671 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.194755 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.194769 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.194793 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.194807 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:11Z","lastTransitionTime":"2026-02-21T00:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.297757 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.297814 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.297824 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.297844 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.297857 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:11Z","lastTransitionTime":"2026-02-21T00:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.409608 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.409715 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.409732 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.409754 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.409766 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:11Z","lastTransitionTime":"2026-02-21T00:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.513201 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.513265 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.513282 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.513310 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.513328 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:11Z","lastTransitionTime":"2026-02-21T00:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.516580 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.516607 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.516584 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.516786 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:09:11 crc kubenswrapper[4906]: E0221 00:09:11.516976 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhw7p" podUID="7544a92e-993a-46af-9f26-243f53d1206d" Feb 21 00:09:11 crc kubenswrapper[4906]: E0221 00:09:11.517226 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:09:11 crc kubenswrapper[4906]: E0221 00:09:11.517976 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:09:11 crc kubenswrapper[4906]: E0221 00:09:11.518094 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.520518 4906 scope.go:117] "RemoveContainer" containerID="8d03248e0e19e586e9f1d4075918c0c678e58d924fb7daf7c7c30daca5f732a8" Feb 21 00:09:11 crc kubenswrapper[4906]: E0221 00:09:11.520971 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bmsd9_openshift-ovn-kubernetes(23efa997-378b-44cd-9f05-4a80559cd09b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.530284 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 21:27:36.593400768 +0000 UTC Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.616429 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.616504 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.616523 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.616550 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.616572 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:11Z","lastTransitionTime":"2026-02-21T00:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.719955 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.720022 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.720039 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.720068 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.720086 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:11Z","lastTransitionTime":"2026-02-21T00:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.823866 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.823956 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.823979 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.824014 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.824037 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:11Z","lastTransitionTime":"2026-02-21T00:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.927201 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.927245 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.927257 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.927273 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:11 crc kubenswrapper[4906]: I0221 00:09:11.927285 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:11Z","lastTransitionTime":"2026-02-21T00:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.030286 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.030339 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.030352 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.030373 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.030386 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:12Z","lastTransitionTime":"2026-02-21T00:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.134606 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.134728 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.134759 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.134792 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.134816 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:12Z","lastTransitionTime":"2026-02-21T00:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.238299 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.238356 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.238367 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.238385 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.238398 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:12Z","lastTransitionTime":"2026-02-21T00:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.341605 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.341652 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.341668 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.341722 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.341739 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:12Z","lastTransitionTime":"2026-02-21T00:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.444680 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.444746 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.444760 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.444785 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.444796 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:12Z","lastTransitionTime":"2026-02-21T00:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.530725 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 05:11:44.684910594 +0000 UTC Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.548442 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.548484 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.548497 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.548515 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.548527 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:12Z","lastTransitionTime":"2026-02-21T00:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.651673 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.651774 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.651791 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.651815 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.651831 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:12Z","lastTransitionTime":"2026-02-21T00:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.755097 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.755136 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.755144 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.755160 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.755170 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:12Z","lastTransitionTime":"2026-02-21T00:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.857183 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.857242 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.857261 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.857285 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.857302 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:12Z","lastTransitionTime":"2026-02-21T00:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.960033 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.960118 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.960137 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.960209 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.960231 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:12Z","lastTransitionTime":"2026-02-21T00:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.966396 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.966461 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.966479 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.966953 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 00:09:12 crc kubenswrapper[4906]: I0221 00:09:12.967003 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T00:09:12Z","lastTransitionTime":"2026-02-21T00:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 00:09:13 crc kubenswrapper[4906]: I0221 00:09:13.024284 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7g6s"] Feb 21 00:09:13 crc kubenswrapper[4906]: I0221 00:09:13.024854 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7g6s" Feb 21 00:09:13 crc kubenswrapper[4906]: I0221 00:09:13.028869 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 21 00:09:13 crc kubenswrapper[4906]: I0221 00:09:13.031850 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 21 00:09:13 crc kubenswrapper[4906]: I0221 00:09:13.031963 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 21 00:09:13 crc kubenswrapper[4906]: I0221 00:09:13.032265 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 21 00:09:13 crc kubenswrapper[4906]: I0221 00:09:13.076592 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-tfzxw" podStartSLOduration=68.076567678 podStartE2EDuration="1m8.076567678s" podCreationTimestamp="2026-02-21 00:08:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:09:13.064430605 +0000 UTC m=+88.316018161" watchObservedRunningTime="2026-02-21 00:09:13.076567678 +0000 UTC m=+88.328155174" Feb 21 00:09:13 crc kubenswrapper[4906]: I0221 00:09:13.096802 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-hbhzd" podStartSLOduration=68.096778319 podStartE2EDuration="1m8.096778319s" podCreationTimestamp="2026-02-21 00:08:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:09:13.077402312 +0000 UTC m=+88.328989838" watchObservedRunningTime="2026-02-21 00:09:13.096778319 +0000 UTC m=+88.348365825" Feb 
21 00:09:13 crc kubenswrapper[4906]: I0221 00:09:13.136518 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-cqkxl" podStartSLOduration=68.136495392 podStartE2EDuration="1m8.136495392s" podCreationTimestamp="2026-02-21 00:08:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:09:13.123053822 +0000 UTC m=+88.374641328" watchObservedRunningTime="2026-02-21 00:09:13.136495392 +0000 UTC m=+88.388082898" Feb 21 00:09:13 crc kubenswrapper[4906]: I0221 00:09:13.137289 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qqnnm" podStartSLOduration=67.137280185 podStartE2EDuration="1m7.137280185s" podCreationTimestamp="2026-02-21 00:08:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:09:13.137279395 +0000 UTC m=+88.388866901" watchObservedRunningTime="2026-02-21 00:09:13.137280185 +0000 UTC m=+88.388867691" Feb 21 00:09:13 crc kubenswrapper[4906]: I0221 00:09:13.163547 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/066da598-dcbe-48a2-bf76-6d6033bb889e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-g7g6s\" (UID: \"066da598-dcbe-48a2-bf76-6d6033bb889e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7g6s" Feb 21 00:09:13 crc kubenswrapper[4906]: I0221 00:09:13.163865 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/066da598-dcbe-48a2-bf76-6d6033bb889e-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-g7g6s\" (UID: \"066da598-dcbe-48a2-bf76-6d6033bb889e\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7g6s" Feb 21 00:09:13 crc kubenswrapper[4906]: I0221 00:09:13.164009 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/066da598-dcbe-48a2-bf76-6d6033bb889e-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-g7g6s\" (UID: \"066da598-dcbe-48a2-bf76-6d6033bb889e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7g6s" Feb 21 00:09:13 crc kubenswrapper[4906]: I0221 00:09:13.164463 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/066da598-dcbe-48a2-bf76-6d6033bb889e-service-ca\") pod \"cluster-version-operator-5c965bbfc6-g7g6s\" (UID: \"066da598-dcbe-48a2-bf76-6d6033bb889e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7g6s" Feb 21 00:09:13 crc kubenswrapper[4906]: I0221 00:09:13.164583 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/066da598-dcbe-48a2-bf76-6d6033bb889e-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-g7g6s\" (UID: \"066da598-dcbe-48a2-bf76-6d6033bb889e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7g6s" Feb 21 00:09:13 crc kubenswrapper[4906]: I0221 00:09:13.176539 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=66.176521504 podStartE2EDuration="1m6.176521504s" podCreationTimestamp="2026-02-21 00:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:09:13.162397605 +0000 UTC m=+88.413985121" watchObservedRunningTime="2026-02-21 00:09:13.176521504 +0000 UTC m=+88.428109010" Feb 21 00:09:13 crc 
kubenswrapper[4906]: I0221 00:09:13.196410 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=68.196380656 podStartE2EDuration="1m8.196380656s" podCreationTimestamp="2026-02-21 00:08:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:09:13.176969077 +0000 UTC m=+88.428556583" watchObservedRunningTime="2026-02-21 00:09:13.196380656 +0000 UTC m=+88.447968172" Feb 21 00:09:13 crc kubenswrapper[4906]: I0221 00:09:13.257517 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=68.257495984 podStartE2EDuration="1m8.257495984s" podCreationTimestamp="2026-02-21 00:08:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:09:13.257026641 +0000 UTC m=+88.508614147" watchObservedRunningTime="2026-02-21 00:09:13.257495984 +0000 UTC m=+88.509083500" Feb 21 00:09:13 crc kubenswrapper[4906]: I0221 00:09:13.265940 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/066da598-dcbe-48a2-bf76-6d6033bb889e-service-ca\") pod \"cluster-version-operator-5c965bbfc6-g7g6s\" (UID: \"066da598-dcbe-48a2-bf76-6d6033bb889e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7g6s" Feb 21 00:09:13 crc kubenswrapper[4906]: I0221 00:09:13.266123 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/066da598-dcbe-48a2-bf76-6d6033bb889e-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-g7g6s\" (UID: \"066da598-dcbe-48a2-bf76-6d6033bb889e\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7g6s" Feb 21 00:09:13 crc kubenswrapper[4906]: I0221 00:09:13.266244 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/066da598-dcbe-48a2-bf76-6d6033bb889e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-g7g6s\" (UID: \"066da598-dcbe-48a2-bf76-6d6033bb889e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7g6s" Feb 21 00:09:13 crc kubenswrapper[4906]: I0221 00:09:13.266342 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/066da598-dcbe-48a2-bf76-6d6033bb889e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-g7g6s\" (UID: \"066da598-dcbe-48a2-bf76-6d6033bb889e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7g6s" Feb 21 00:09:13 crc kubenswrapper[4906]: I0221 00:09:13.266493 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/066da598-dcbe-48a2-bf76-6d6033bb889e-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-g7g6s\" (UID: \"066da598-dcbe-48a2-bf76-6d6033bb889e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7g6s" Feb 21 00:09:13 crc kubenswrapper[4906]: I0221 00:09:13.266653 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/066da598-dcbe-48a2-bf76-6d6033bb889e-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-g7g6s\" (UID: \"066da598-dcbe-48a2-bf76-6d6033bb889e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7g6s" Feb 21 00:09:13 crc kubenswrapper[4906]: I0221 00:09:13.266744 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/066da598-dcbe-48a2-bf76-6d6033bb889e-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-g7g6s\" (UID: \"066da598-dcbe-48a2-bf76-6d6033bb889e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7g6s" Feb 21 00:09:13 crc kubenswrapper[4906]: I0221 00:09:13.266897 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/066da598-dcbe-48a2-bf76-6d6033bb889e-service-ca\") pod \"cluster-version-operator-5c965bbfc6-g7g6s\" (UID: \"066da598-dcbe-48a2-bf76-6d6033bb889e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7g6s" Feb 21 00:09:13 crc kubenswrapper[4906]: I0221 00:09:13.271464 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/066da598-dcbe-48a2-bf76-6d6033bb889e-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-g7g6s\" (UID: \"066da598-dcbe-48a2-bf76-6d6033bb889e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7g6s" Feb 21 00:09:13 crc kubenswrapper[4906]: I0221 00:09:13.284293 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podStartSLOduration=68.284272821 podStartE2EDuration="1m8.284272821s" podCreationTimestamp="2026-02-21 00:08:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:09:13.284085346 +0000 UTC m=+88.535672852" watchObservedRunningTime="2026-02-21 00:09:13.284272821 +0000 UTC m=+88.535860327" Feb 21 00:09:13 crc kubenswrapper[4906]: I0221 00:09:13.284587 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/066da598-dcbe-48a2-bf76-6d6033bb889e-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-g7g6s\" (UID: 
\"066da598-dcbe-48a2-bf76-6d6033bb889e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7g6s" Feb 21 00:09:13 crc kubenswrapper[4906]: I0221 00:09:13.297131 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-x4dw6" podStartSLOduration=67.297112614 podStartE2EDuration="1m7.297112614s" podCreationTimestamp="2026-02-21 00:08:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:09:13.296659731 +0000 UTC m=+88.548247237" watchObservedRunningTime="2026-02-21 00:09:13.297112614 +0000 UTC m=+88.548700120" Feb 21 00:09:13 crc kubenswrapper[4906]: I0221 00:09:13.306502 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=37.306486069 podStartE2EDuration="37.306486069s" podCreationTimestamp="2026-02-21 00:08:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:09:13.305666136 +0000 UTC m=+88.557253642" watchObservedRunningTime="2026-02-21 00:09:13.306486069 +0000 UTC m=+88.558073575" Feb 21 00:09:13 crc kubenswrapper[4906]: I0221 00:09:13.340333 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7g6s" Feb 21 00:09:13 crc kubenswrapper[4906]: W0221 00:09:13.353987 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod066da598_dcbe_48a2_bf76_6d6033bb889e.slice/crio-7e4f50cb2173cf52a4cbcacbf12ec9a857e15e7bb92be968518a74decc5bcc00 WatchSource:0}: Error finding container 7e4f50cb2173cf52a4cbcacbf12ec9a857e15e7bb92be968518a74decc5bcc00: Status 404 returned error can't find the container with id 7e4f50cb2173cf52a4cbcacbf12ec9a857e15e7bb92be968518a74decc5bcc00 Feb 21 00:09:13 crc kubenswrapper[4906]: I0221 00:09:13.516135 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:09:13 crc kubenswrapper[4906]: I0221 00:09:13.516186 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:09:13 crc kubenswrapper[4906]: I0221 00:09:13.516242 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:09:13 crc kubenswrapper[4906]: I0221 00:09:13.516301 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:09:13 crc kubenswrapper[4906]: E0221 00:09:13.516290 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:09:13 crc kubenswrapper[4906]: E0221 00:09:13.516516 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhw7p" podUID="7544a92e-993a-46af-9f26-243f53d1206d" Feb 21 00:09:13 crc kubenswrapper[4906]: E0221 00:09:13.516671 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:09:13 crc kubenswrapper[4906]: E0221 00:09:13.517010 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:09:13 crc kubenswrapper[4906]: I0221 00:09:13.529827 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 21 00:09:13 crc kubenswrapper[4906]: I0221 00:09:13.531015 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 13:53:07.458040053 +0000 UTC Feb 21 00:09:13 crc kubenswrapper[4906]: I0221 00:09:13.531106 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 21 00:09:13 crc kubenswrapper[4906]: I0221 00:09:13.539545 4906 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 21 00:09:14 crc kubenswrapper[4906]: I0221 00:09:14.124763 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7g6s" event={"ID":"066da598-dcbe-48a2-bf76-6d6033bb889e","Type":"ContainerStarted","Data":"4876d522751af4280cab83fc5e9ec0f2215ead8d3d1b3a6089484848118b2362"} Feb 21 00:09:14 crc kubenswrapper[4906]: I0221 00:09:14.124848 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7g6s" event={"ID":"066da598-dcbe-48a2-bf76-6d6033bb889e","Type":"ContainerStarted","Data":"7e4f50cb2173cf52a4cbcacbf12ec9a857e15e7bb92be968518a74decc5bcc00"} Feb 21 00:09:14 crc kubenswrapper[4906]: I0221 00:09:14.147826 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7g6s" podStartSLOduration=69.147746416 podStartE2EDuration="1m9.147746416s" podCreationTimestamp="2026-02-21 00:08:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-02-21 00:09:14.147274303 +0000 UTC m=+89.398861809" watchObservedRunningTime="2026-02-21 00:09:14.147746416 +0000 UTC m=+89.399334002" Feb 21 00:09:14 crc kubenswrapper[4906]: I0221 00:09:14.165372 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=1.165341424 podStartE2EDuration="1.165341424s" podCreationTimestamp="2026-02-21 00:09:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:09:14.16521891 +0000 UTC m=+89.416806456" watchObservedRunningTime="2026-02-21 00:09:14.165341424 +0000 UTC m=+89.416928930" Feb 21 00:09:15 crc kubenswrapper[4906]: I0221 00:09:15.516668 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:09:15 crc kubenswrapper[4906]: I0221 00:09:15.516884 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:09:15 crc kubenswrapper[4906]: I0221 00:09:15.516998 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:09:15 crc kubenswrapper[4906]: E0221 00:09:15.518873 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhw7p" podUID="7544a92e-993a-46af-9f26-243f53d1206d" Feb 21 00:09:15 crc kubenswrapper[4906]: I0221 00:09:15.518926 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:09:15 crc kubenswrapper[4906]: E0221 00:09:15.519065 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:09:15 crc kubenswrapper[4906]: E0221 00:09:15.519191 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:09:15 crc kubenswrapper[4906]: E0221 00:09:15.519286 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:09:17 crc kubenswrapper[4906]: I0221 00:09:17.516518 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:09:17 crc kubenswrapper[4906]: I0221 00:09:17.516585 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:09:17 crc kubenswrapper[4906]: I0221 00:09:17.516661 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:09:17 crc kubenswrapper[4906]: I0221 00:09:17.517176 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:09:17 crc kubenswrapper[4906]: E0221 00:09:17.517349 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:09:17 crc kubenswrapper[4906]: E0221 00:09:17.517492 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhw7p" podUID="7544a92e-993a-46af-9f26-243f53d1206d" Feb 21 00:09:17 crc kubenswrapper[4906]: E0221 00:09:17.517607 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:09:17 crc kubenswrapper[4906]: E0221 00:09:17.517788 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:09:19 crc kubenswrapper[4906]: I0221 00:09:19.516746 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:09:19 crc kubenswrapper[4906]: I0221 00:09:19.517183 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:09:19 crc kubenswrapper[4906]: I0221 00:09:19.517236 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:09:19 crc kubenswrapper[4906]: I0221 00:09:19.517199 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:09:19 crc kubenswrapper[4906]: E0221 00:09:19.517382 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:09:19 crc kubenswrapper[4906]: E0221 00:09:19.517512 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:09:19 crc kubenswrapper[4906]: E0221 00:09:19.517634 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:09:19 crc kubenswrapper[4906]: E0221 00:09:19.517183 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhw7p" podUID="7544a92e-993a-46af-9f26-243f53d1206d" Feb 21 00:09:21 crc kubenswrapper[4906]: I0221 00:09:21.516478 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:09:21 crc kubenswrapper[4906]: I0221 00:09:21.516566 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:09:21 crc kubenswrapper[4906]: I0221 00:09:21.516598 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:09:21 crc kubenswrapper[4906]: E0221 00:09:21.517431 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:09:21 crc kubenswrapper[4906]: E0221 00:09:21.517405 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:09:21 crc kubenswrapper[4906]: I0221 00:09:21.516612 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:09:21 crc kubenswrapper[4906]: E0221 00:09:21.517592 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:09:21 crc kubenswrapper[4906]: E0221 00:09:21.517829 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhw7p" podUID="7544a92e-993a-46af-9f26-243f53d1206d" Feb 21 00:09:23 crc kubenswrapper[4906]: I0221 00:09:23.517163 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:09:23 crc kubenswrapper[4906]: I0221 00:09:23.517308 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:09:23 crc kubenswrapper[4906]: I0221 00:09:23.517176 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:09:23 crc kubenswrapper[4906]: E0221 00:09:23.517425 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:09:23 crc kubenswrapper[4906]: I0221 00:09:23.517721 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:09:23 crc kubenswrapper[4906]: E0221 00:09:23.517594 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:09:23 crc kubenswrapper[4906]: E0221 00:09:23.517843 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:09:23 crc kubenswrapper[4906]: E0221 00:09:23.517981 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhw7p" podUID="7544a92e-993a-46af-9f26-243f53d1206d" Feb 21 00:09:25 crc kubenswrapper[4906]: I0221 00:09:25.517031 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:09:25 crc kubenswrapper[4906]: I0221 00:09:25.517045 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:09:25 crc kubenswrapper[4906]: I0221 00:09:25.517129 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:09:25 crc kubenswrapper[4906]: E0221 00:09:25.519216 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhw7p" podUID="7544a92e-993a-46af-9f26-243f53d1206d" Feb 21 00:09:25 crc kubenswrapper[4906]: I0221 00:09:25.519264 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:09:25 crc kubenswrapper[4906]: E0221 00:09:25.519413 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:09:25 crc kubenswrapper[4906]: E0221 00:09:25.519569 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:09:25 crc kubenswrapper[4906]: E0221 00:09:25.519682 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:09:25 crc kubenswrapper[4906]: I0221 00:09:25.724639 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7544a92e-993a-46af-9f26-243f53d1206d-metrics-certs\") pod \"network-metrics-daemon-rhw7p\" (UID: \"7544a92e-993a-46af-9f26-243f53d1206d\") " pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:09:25 crc kubenswrapper[4906]: E0221 00:09:25.724911 4906 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 21 00:09:25 crc kubenswrapper[4906]: E0221 00:09:25.725039 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7544a92e-993a-46af-9f26-243f53d1206d-metrics-certs podName:7544a92e-993a-46af-9f26-243f53d1206d nodeName:}" failed. No retries permitted until 2026-02-21 00:10:29.724999658 +0000 UTC m=+164.976587164 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7544a92e-993a-46af-9f26-243f53d1206d-metrics-certs") pod "network-metrics-daemon-rhw7p" (UID: "7544a92e-993a-46af-9f26-243f53d1206d") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 21 00:09:26 crc kubenswrapper[4906]: I0221 00:09:26.518284 4906 scope.go:117] "RemoveContainer" containerID="8d03248e0e19e586e9f1d4075918c0c678e58d924fb7daf7c7c30daca5f732a8" Feb 21 00:09:26 crc kubenswrapper[4906]: E0221 00:09:26.519646 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bmsd9_openshift-ovn-kubernetes(23efa997-378b-44cd-9f05-4a80559cd09b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" Feb 21 00:09:27 crc kubenswrapper[4906]: I0221 00:09:27.516434 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:09:27 crc kubenswrapper[4906]: I0221 00:09:27.516512 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:09:27 crc kubenswrapper[4906]: I0221 00:09:27.516446 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:09:27 crc kubenswrapper[4906]: I0221 00:09:27.517325 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:09:27 crc kubenswrapper[4906]: E0221 00:09:27.517590 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:09:27 crc kubenswrapper[4906]: E0221 00:09:27.517976 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhw7p" podUID="7544a92e-993a-46af-9f26-243f53d1206d" Feb 21 00:09:27 crc kubenswrapper[4906]: E0221 00:09:27.518165 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:09:27 crc kubenswrapper[4906]: E0221 00:09:27.518102 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:09:29 crc kubenswrapper[4906]: I0221 00:09:29.516845 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:09:29 crc kubenswrapper[4906]: I0221 00:09:29.516962 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:09:29 crc kubenswrapper[4906]: E0221 00:09:29.517469 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:09:29 crc kubenswrapper[4906]: I0221 00:09:29.517032 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:09:29 crc kubenswrapper[4906]: I0221 00:09:29.516967 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:09:29 crc kubenswrapper[4906]: E0221 00:09:29.517662 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:09:29 crc kubenswrapper[4906]: E0221 00:09:29.517864 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:09:29 crc kubenswrapper[4906]: E0221 00:09:29.518002 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhw7p" podUID="7544a92e-993a-46af-9f26-243f53d1206d" Feb 21 00:09:31 crc kubenswrapper[4906]: I0221 00:09:31.517015 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:09:31 crc kubenswrapper[4906]: I0221 00:09:31.517065 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:09:31 crc kubenswrapper[4906]: I0221 00:09:31.517033 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:09:31 crc kubenswrapper[4906]: I0221 00:09:31.517159 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:09:31 crc kubenswrapper[4906]: E0221 00:09:31.517339 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:09:31 crc kubenswrapper[4906]: E0221 00:09:31.517464 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:09:31 crc kubenswrapper[4906]: E0221 00:09:31.517981 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhw7p" podUID="7544a92e-993a-46af-9f26-243f53d1206d" Feb 21 00:09:31 crc kubenswrapper[4906]: E0221 00:09:31.518102 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:09:33 crc kubenswrapper[4906]: I0221 00:09:33.516777 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:09:33 crc kubenswrapper[4906]: I0221 00:09:33.516796 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:09:33 crc kubenswrapper[4906]: I0221 00:09:33.516949 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:09:33 crc kubenswrapper[4906]: I0221 00:09:33.517018 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:09:33 crc kubenswrapper[4906]: E0221 00:09:33.516976 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:09:33 crc kubenswrapper[4906]: E0221 00:09:33.517126 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:09:33 crc kubenswrapper[4906]: E0221 00:09:33.517265 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhw7p" podUID="7544a92e-993a-46af-9f26-243f53d1206d" Feb 21 00:09:33 crc kubenswrapper[4906]: E0221 00:09:33.517362 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:09:35 crc kubenswrapper[4906]: I0221 00:09:35.516114 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:09:35 crc kubenswrapper[4906]: I0221 00:09:35.516331 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:09:35 crc kubenswrapper[4906]: E0221 00:09:35.518882 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:09:35 crc kubenswrapper[4906]: I0221 00:09:35.518912 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:09:35 crc kubenswrapper[4906]: I0221 00:09:35.519000 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:09:35 crc kubenswrapper[4906]: E0221 00:09:35.519258 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:09:35 crc kubenswrapper[4906]: E0221 00:09:35.519340 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhw7p" podUID="7544a92e-993a-46af-9f26-243f53d1206d" Feb 21 00:09:35 crc kubenswrapper[4906]: E0221 00:09:35.519446 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:09:37 crc kubenswrapper[4906]: I0221 00:09:37.516121 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:09:37 crc kubenswrapper[4906]: E0221 00:09:37.516306 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:09:37 crc kubenswrapper[4906]: I0221 00:09:37.516145 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:09:37 crc kubenswrapper[4906]: E0221 00:09:37.516478 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhw7p" podUID="7544a92e-993a-46af-9f26-243f53d1206d" Feb 21 00:09:37 crc kubenswrapper[4906]: I0221 00:09:37.516955 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:09:37 crc kubenswrapper[4906]: I0221 00:09:37.517029 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:09:37 crc kubenswrapper[4906]: E0221 00:09:37.517128 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:09:37 crc kubenswrapper[4906]: E0221 00:09:37.517319 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:09:39 crc kubenswrapper[4906]: I0221 00:09:39.516143 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:09:39 crc kubenswrapper[4906]: I0221 00:09:39.516205 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:09:39 crc kubenswrapper[4906]: I0221 00:09:39.516240 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:09:39 crc kubenswrapper[4906]: E0221 00:09:39.516330 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:09:39 crc kubenswrapper[4906]: I0221 00:09:39.516543 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:09:39 crc kubenswrapper[4906]: E0221 00:09:39.516514 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:09:39 crc kubenswrapper[4906]: E0221 00:09:39.516730 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:09:39 crc kubenswrapper[4906]: E0221 00:09:39.516878 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhw7p" podUID="7544a92e-993a-46af-9f26-243f53d1206d" Feb 21 00:09:40 crc kubenswrapper[4906]: I0221 00:09:40.517952 4906 scope.go:117] "RemoveContainer" containerID="8d03248e0e19e586e9f1d4075918c0c678e58d924fb7daf7c7c30daca5f732a8" Feb 21 00:09:41 crc kubenswrapper[4906]: I0221 00:09:41.224707 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cqkxl_d15db4e7-a13a-4bd9-8083-1ed09be64a82/kube-multus/1.log" Feb 21 00:09:41 crc kubenswrapper[4906]: I0221 00:09:41.225315 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cqkxl_d15db4e7-a13a-4bd9-8083-1ed09be64a82/kube-multus/0.log" Feb 21 00:09:41 crc kubenswrapper[4906]: I0221 00:09:41.225380 4906 generic.go:334] "Generic (PLEG): container finished" podID="d15db4e7-a13a-4bd9-8083-1ed09be64a82" containerID="3b20b532af977461f36343fe3b9c58f154e726adea87f4f9c31c95a7c46dc495" exitCode=1 Feb 21 00:09:41 crc kubenswrapper[4906]: I0221 00:09:41.225463 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cqkxl" event={"ID":"d15db4e7-a13a-4bd9-8083-1ed09be64a82","Type":"ContainerDied","Data":"3b20b532af977461f36343fe3b9c58f154e726adea87f4f9c31c95a7c46dc495"} Feb 21 00:09:41 crc kubenswrapper[4906]: I0221 00:09:41.225533 4906 scope.go:117] "RemoveContainer" containerID="1f4f38fff46919386afd729ea3f6437497e71eeeb557d7f2b955d0677b822a86" Feb 21 00:09:41 crc kubenswrapper[4906]: I0221 00:09:41.225989 4906 
scope.go:117] "RemoveContainer" containerID="3b20b532af977461f36343fe3b9c58f154e726adea87f4f9c31c95a7c46dc495" Feb 21 00:09:41 crc kubenswrapper[4906]: E0221 00:09:41.226220 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-cqkxl_openshift-multus(d15db4e7-a13a-4bd9-8083-1ed09be64a82)\"" pod="openshift-multus/multus-cqkxl" podUID="d15db4e7-a13a-4bd9-8083-1ed09be64a82" Feb 21 00:09:41 crc kubenswrapper[4906]: I0221 00:09:41.228363 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmsd9_23efa997-378b-44cd-9f05-4a80559cd09b/ovnkube-controller/3.log" Feb 21 00:09:41 crc kubenswrapper[4906]: I0221 00:09:41.232339 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" event={"ID":"23efa997-378b-44cd-9f05-4a80559cd09b","Type":"ContainerStarted","Data":"753df9d7bebdaf579e7460708758059a58b28e10413d71efcc87a3c3e021c112"} Feb 21 00:09:41 crc kubenswrapper[4906]: I0221 00:09:41.233465 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:09:41 crc kubenswrapper[4906]: I0221 00:09:41.272431 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" podStartSLOduration=96.272408717 podStartE2EDuration="1m36.272408717s" podCreationTimestamp="2026-02-21 00:08:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:09:41.271039878 +0000 UTC m=+116.522627384" watchObservedRunningTime="2026-02-21 00:09:41.272408717 +0000 UTC m=+116.523996233" Feb 21 00:09:41 crc kubenswrapper[4906]: I0221 00:09:41.382196 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-multus/network-metrics-daemon-rhw7p"] Feb 21 00:09:41 crc kubenswrapper[4906]: I0221 00:09:41.382338 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:09:41 crc kubenswrapper[4906]: E0221 00:09:41.382438 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhw7p" podUID="7544a92e-993a-46af-9f26-243f53d1206d" Feb 21 00:09:41 crc kubenswrapper[4906]: I0221 00:09:41.516320 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:09:41 crc kubenswrapper[4906]: I0221 00:09:41.516359 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:09:41 crc kubenswrapper[4906]: E0221 00:09:41.516491 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:09:41 crc kubenswrapper[4906]: I0221 00:09:41.516515 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:09:41 crc kubenswrapper[4906]: E0221 00:09:41.516616 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:09:41 crc kubenswrapper[4906]: E0221 00:09:41.516778 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:09:42 crc kubenswrapper[4906]: I0221 00:09:42.237385 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cqkxl_d15db4e7-a13a-4bd9-8083-1ed09be64a82/kube-multus/1.log" Feb 21 00:09:43 crc kubenswrapper[4906]: I0221 00:09:43.516980 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:09:43 crc kubenswrapper[4906]: I0221 00:09:43.517051 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:09:43 crc kubenswrapper[4906]: I0221 00:09:43.517066 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:09:43 crc kubenswrapper[4906]: E0221 00:09:43.517152 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:09:43 crc kubenswrapper[4906]: E0221 00:09:43.517250 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhw7p" podUID="7544a92e-993a-46af-9f26-243f53d1206d" Feb 21 00:09:43 crc kubenswrapper[4906]: E0221 00:09:43.517522 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:09:43 crc kubenswrapper[4906]: I0221 00:09:43.516987 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:09:43 crc kubenswrapper[4906]: E0221 00:09:43.517823 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:09:45 crc kubenswrapper[4906]: I0221 00:09:45.516974 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:09:45 crc kubenswrapper[4906]: I0221 00:09:45.517128 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:09:45 crc kubenswrapper[4906]: I0221 00:09:45.517128 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:09:45 crc kubenswrapper[4906]: I0221 00:09:45.517196 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:09:45 crc kubenswrapper[4906]: E0221 00:09:45.519576 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rhw7p" podUID="7544a92e-993a-46af-9f26-243f53d1206d" Feb 21 00:09:45 crc kubenswrapper[4906]: E0221 00:09:45.519887 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:09:45 crc kubenswrapper[4906]: E0221 00:09:45.520015 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:09:45 crc kubenswrapper[4906]: E0221 00:09:45.520188 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:09:45 crc kubenswrapper[4906]: E0221 00:09:45.550022 4906 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Feb 21 00:09:45 crc kubenswrapper[4906]: E0221 00:09:45.756892 4906 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 21 00:09:47 crc kubenswrapper[4906]: I0221 00:09:47.516519 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:09:47 crc kubenswrapper[4906]: I0221 00:09:47.516754 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:09:47 crc kubenswrapper[4906]: E0221 00:09:47.516954 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:09:47 crc kubenswrapper[4906]: I0221 00:09:47.517213 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:09:47 crc kubenswrapper[4906]: I0221 00:09:47.517246 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:09:47 crc kubenswrapper[4906]: E0221 00:09:47.517303 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:09:47 crc kubenswrapper[4906]: E0221 00:09:47.517464 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhw7p" podUID="7544a92e-993a-46af-9f26-243f53d1206d" Feb 21 00:09:47 crc kubenswrapper[4906]: E0221 00:09:47.517579 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:09:49 crc kubenswrapper[4906]: I0221 00:09:49.517144 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:09:49 crc kubenswrapper[4906]: I0221 00:09:49.517223 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:09:49 crc kubenswrapper[4906]: E0221 00:09:49.517381 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:09:49 crc kubenswrapper[4906]: I0221 00:09:49.517442 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:09:49 crc kubenswrapper[4906]: E0221 00:09:49.517638 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:09:49 crc kubenswrapper[4906]: E0221 00:09:49.517824 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhw7p" podUID="7544a92e-993a-46af-9f26-243f53d1206d" Feb 21 00:09:49 crc kubenswrapper[4906]: I0221 00:09:49.517181 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:09:49 crc kubenswrapper[4906]: E0221 00:09:49.518079 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:09:49 crc kubenswrapper[4906]: I0221 00:09:49.922461 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:09:50 crc kubenswrapper[4906]: E0221 00:09:50.758505 4906 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 21 00:09:51 crc kubenswrapper[4906]: I0221 00:09:51.517008 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:09:51 crc kubenswrapper[4906]: I0221 00:09:51.517115 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:09:51 crc kubenswrapper[4906]: E0221 00:09:51.517343 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:09:51 crc kubenswrapper[4906]: E0221 00:09:51.517542 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rhw7p" podUID="7544a92e-993a-46af-9f26-243f53d1206d" Feb 21 00:09:51 crc kubenswrapper[4906]: I0221 00:09:51.517100 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:09:51 crc kubenswrapper[4906]: I0221 00:09:51.517906 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:09:51 crc kubenswrapper[4906]: E0221 00:09:51.518034 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:09:51 crc kubenswrapper[4906]: E0221 00:09:51.518245 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:09:53 crc kubenswrapper[4906]: I0221 00:09:53.516897 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:09:53 crc kubenswrapper[4906]: E0221 00:09:53.518012 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:09:53 crc kubenswrapper[4906]: I0221 00:09:53.517048 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:09:53 crc kubenswrapper[4906]: I0221 00:09:53.517488 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:09:53 crc kubenswrapper[4906]: I0221 00:09:53.517040 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:09:53 crc kubenswrapper[4906]: E0221 00:09:53.518342 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhw7p" podUID="7544a92e-993a-46af-9f26-243f53d1206d" Feb 21 00:09:53 crc kubenswrapper[4906]: E0221 00:09:53.518195 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:09:53 crc kubenswrapper[4906]: E0221 00:09:53.518459 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:09:55 crc kubenswrapper[4906]: I0221 00:09:55.516747 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:09:55 crc kubenswrapper[4906]: E0221 00:09:55.518742 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:09:55 crc kubenswrapper[4906]: I0221 00:09:55.518805 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:09:55 crc kubenswrapper[4906]: I0221 00:09:55.518842 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:09:55 crc kubenswrapper[4906]: E0221 00:09:55.519069 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:09:55 crc kubenswrapper[4906]: I0221 00:09:55.519176 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:09:55 crc kubenswrapper[4906]: E0221 00:09:55.519287 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:09:55 crc kubenswrapper[4906]: E0221 00:09:55.519413 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhw7p" podUID="7544a92e-993a-46af-9f26-243f53d1206d" Feb 21 00:09:55 crc kubenswrapper[4906]: E0221 00:09:55.759972 4906 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Feb 21 00:09:56 crc kubenswrapper[4906]: I0221 00:09:56.517091 4906 scope.go:117] "RemoveContainer" containerID="3b20b532af977461f36343fe3b9c58f154e726adea87f4f9c31c95a7c46dc495" Feb 21 00:09:57 crc kubenswrapper[4906]: I0221 00:09:57.290474 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cqkxl_d15db4e7-a13a-4bd9-8083-1ed09be64a82/kube-multus/1.log" Feb 21 00:09:57 crc kubenswrapper[4906]: I0221 00:09:57.290994 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cqkxl" event={"ID":"d15db4e7-a13a-4bd9-8083-1ed09be64a82","Type":"ContainerStarted","Data":"02d81b2a66d5bb33ec6a29de8fb43e80fcad7b637e052f4f4f0dd20e1c091ab4"} Feb 21 00:09:57 crc kubenswrapper[4906]: I0221 00:09:57.516496 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:09:57 crc kubenswrapper[4906]: E0221 00:09:57.516646 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:09:57 crc kubenswrapper[4906]: I0221 00:09:57.516492 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:09:57 crc kubenswrapper[4906]: I0221 00:09:57.516816 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:09:57 crc kubenswrapper[4906]: E0221 00:09:57.516957 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:09:57 crc kubenswrapper[4906]: I0221 00:09:57.517161 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:09:57 crc kubenswrapper[4906]: E0221 00:09:57.517179 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhw7p" podUID="7544a92e-993a-46af-9f26-243f53d1206d" Feb 21 00:09:57 crc kubenswrapper[4906]: E0221 00:09:57.517401 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:09:59 crc kubenswrapper[4906]: I0221 00:09:59.516105 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:09:59 crc kubenswrapper[4906]: I0221 00:09:59.516179 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:09:59 crc kubenswrapper[4906]: I0221 00:09:59.516099 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:09:59 crc kubenswrapper[4906]: E0221 00:09:59.516343 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 00:09:59 crc kubenswrapper[4906]: I0221 00:09:59.516339 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:09:59 crc kubenswrapper[4906]: E0221 00:09:59.516573 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 00:09:59 crc kubenswrapper[4906]: E0221 00:09:59.516673 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhw7p" podUID="7544a92e-993a-46af-9f26-243f53d1206d" Feb 21 00:09:59 crc kubenswrapper[4906]: E0221 00:09:59.516795 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 00:10:01 crc kubenswrapper[4906]: I0221 00:10:01.516072 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:10:01 crc kubenswrapper[4906]: I0221 00:10:01.516173 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:10:01 crc kubenswrapper[4906]: I0221 00:10:01.516090 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:10:01 crc kubenswrapper[4906]: I0221 00:10:01.516264 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:10:01 crc kubenswrapper[4906]: I0221 00:10:01.522495 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 21 00:10:01 crc kubenswrapper[4906]: I0221 00:10:01.522860 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 21 00:10:01 crc kubenswrapper[4906]: I0221 00:10:01.523347 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 21 00:10:01 crc kubenswrapper[4906]: I0221 00:10:01.528125 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 21 00:10:01 crc kubenswrapper[4906]: I0221 00:10:01.528535 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 21 00:10:01 crc kubenswrapper[4906]: I0221 00:10:01.533356 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.857342 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.898801 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vmpkj"] Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.899638 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-6578b"] Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.899838 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vmpkj" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.900895 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-bp67z"] Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.901310 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-bp67z" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.901877 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-6578b" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.904677 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.906024 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.907946 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-bbxgp"] Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.908617 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-bbxgp" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.913848 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-w4fcj"] Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.914547 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w4fcj" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.915881 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.915899 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.916065 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.920239 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-r9sr7"] Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.920782 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.921269 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.921449 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r9sr7" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.921515 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.921650 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.921762 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.922072 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.922239 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.922887 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.930239 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.930491 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.930890 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.930942 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.931753 4906 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"etcd-serving-ca" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.931994 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gc8hw"] Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.933032 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-nbpnr"] Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.934357 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gc8hw" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.941052 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-nbpnr" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.948518 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.948552 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.949495 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-86b7j"] Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.950402 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-86b7j" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.953121 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.953339 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.953493 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.953641 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.953710 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.953831 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.953888 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.954803 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.954906 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.955241 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.955608 4906 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.959670 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.960323 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.960456 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.960534 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.960725 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.962495 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.962582 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.962711 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.962758 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.962789 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pvsfj"] Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.962715 4906 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.963250 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pvsfj" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.964975 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.965730 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.965813 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.965884 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.965948 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.966027 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.966374 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.966451 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.966516 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.967037 4906 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.967365 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.967396 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.967589 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.967616 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.967763 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.967780 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.967929 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.968118 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.968286 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.968417 4906 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-console/downloads-7954f5f757-84vm6"] Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.969005 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.969211 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-bk7l4"] Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.969896 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-667sz"] Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.969996 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-84vm6" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.970161 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bk7l4" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.970602 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2xzp4"] Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.970996 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.971742 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-667sz" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.976181 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.980327 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.980466 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-fqmc8"] Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.981259 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-fqmc8" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.982582 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4fkr\" (UniqueName: \"kubernetes.io/projected/e4fc2e33-56c9-440b-a7cc-ea9982d47658-kube-api-access-n4fkr\") pod \"ingress-operator-5b745b69d9-bk7l4\" (UID: \"e4fc2e33-56c9-440b-a7cc-ea9982d47658\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bk7l4" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.982665 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f51d9e17-fed2-4d4a-aeab-8b135b6222fb-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-bp67z\" (UID: \"f51d9e17-fed2-4d4a-aeab-8b135b6222fb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bp67z" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.982714 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/693c1181-a8ca-4d12-8136-8bfad07df623-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-pvsfj\" (UID: \"693c1181-a8ca-4d12-8136-8bfad07df623\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pvsfj" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.982741 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e4fc2e33-56c9-440b-a7cc-ea9982d47658-metrics-tls\") pod \"ingress-operator-5b745b69d9-bk7l4\" (UID: \"e4fc2e33-56c9-440b-a7cc-ea9982d47658\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bk7l4" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.982767 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e4fc2e33-56c9-440b-a7cc-ea9982d47658-bound-sa-token\") pod \"ingress-operator-5b745b69d9-bk7l4\" (UID: \"e4fc2e33-56c9-440b-a7cc-ea9982d47658\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bk7l4" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.982789 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mv5s\" (UniqueName: \"kubernetes.io/projected/693c1181-a8ca-4d12-8136-8bfad07df623-kube-api-access-5mv5s\") pod \"cluster-image-registry-operator-dc59b4c8b-pvsfj\" (UID: \"693c1181-a8ca-4d12-8136-8bfad07df623\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pvsfj" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.982810 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls2rr\" (UniqueName: \"kubernetes.io/projected/f51d9e17-fed2-4d4a-aeab-8b135b6222fb-kube-api-access-ls2rr\") pod \"controller-manager-879f6c89f-bp67z\" (UID: \"f51d9e17-fed2-4d4a-aeab-8b135b6222fb\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-bp67z" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.982830 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e4fc2e33-56c9-440b-a7cc-ea9982d47658-trusted-ca\") pod \"ingress-operator-5b745b69d9-bk7l4\" (UID: \"e4fc2e33-56c9-440b-a7cc-ea9982d47658\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bk7l4" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.982856 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f51d9e17-fed2-4d4a-aeab-8b135b6222fb-config\") pod \"controller-manager-879f6c89f-bp67z\" (UID: \"f51d9e17-fed2-4d4a-aeab-8b135b6222fb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bp67z" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.982873 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f51d9e17-fed2-4d4a-aeab-8b135b6222fb-client-ca\") pod \"controller-manager-879f6c89f-bp67z\" (UID: \"f51d9e17-fed2-4d4a-aeab-8b135b6222fb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bp67z" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.982928 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/693c1181-a8ca-4d12-8136-8bfad07df623-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-pvsfj\" (UID: \"693c1181-a8ca-4d12-8136-8bfad07df623\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pvsfj" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.982956 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/693c1181-a8ca-4d12-8136-8bfad07df623-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-pvsfj\" (UID: \"693c1181-a8ca-4d12-8136-8bfad07df623\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pvsfj" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.982979 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f51d9e17-fed2-4d4a-aeab-8b135b6222fb-serving-cert\") pod \"controller-manager-879f6c89f-bp67z\" (UID: \"f51d9e17-fed2-4d4a-aeab-8b135b6222fb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bp67z" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.982601 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.982704 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.983343 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.982846 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.982878 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.983508 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.982906 4906 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.982935 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.983617 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.982963 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.994565 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.996308 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.996410 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.996744 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.996827 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.997047 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 21 00:10:03 crc kubenswrapper[4906]: I0221 00:10:03.998374 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w9crs"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 
00:10:04.036050 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.036405 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.042865 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6v9gm"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.043546 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tn7tl"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.044065 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tn7tl" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.044546 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.045273 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.045518 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w9crs" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.046537 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-6v9gm" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.047090 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.048941 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.049317 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29527200-nc4ns"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.049501 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.049531 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.050454 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29527200-nc4ns" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.051155 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.052307 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.052229 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.052811 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.053020 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.053236 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.058153 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fm5j7"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.058244 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.059302 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-fm5j7" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.067718 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fdrrq"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.068263 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fdrrq" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.067737 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.069282 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-wxwm4"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.069937 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-wxwm4" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.071540 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.072036 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.072193 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.072353 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.072419 4906 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.072468 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.072573 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lv2pm"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.073166 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lv2pm" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.074313 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-bzr2z"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.074953 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-bzr2z" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.075279 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-dnk5c"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.075935 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dnk5c" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.077740 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-rj6xk"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.080953 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5gw6k"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.081468 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5gw6k" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.081846 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-rj6xk" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.084260 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f51d9e17-fed2-4d4a-aeab-8b135b6222fb-serving-cert\") pod \"controller-manager-879f6c89f-bp67z\" (UID: \"f51d9e17-fed2-4d4a-aeab-8b135b6222fb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bp67z" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.084298 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/53b67334-e090-4527-831d-e36d70482003-etcd-client\") pod \"etcd-operator-b45778765-nbpnr\" (UID: \"53b67334-e090-4527-831d-e36d70482003\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nbpnr" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.084316 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51c6b6f9-3b1f-4028-acc1-dac92556b401-serving-cert\") pod \"apiserver-7bbb656c7d-r9sr7\" (UID: \"51c6b6f9-3b1f-4028-acc1-dac92556b401\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r9sr7" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.084331 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53b67334-e090-4527-831d-e36d70482003-config\") pod \"etcd-operator-b45778765-nbpnr\" (UID: \"53b67334-e090-4527-831d-e36d70482003\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nbpnr" Feb 21 00:10:04 crc 
kubenswrapper[4906]: I0221 00:10:04.084346 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/edb7f37d-699d-46ba-bbe2-7dbb6b0f09f8-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-gc8hw\" (UID: \"edb7f37d-699d-46ba-bbe2-7dbb6b0f09f8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gc8hw" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.084366 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4fkr\" (UniqueName: \"kubernetes.io/projected/e4fc2e33-56c9-440b-a7cc-ea9982d47658-kube-api-access-n4fkr\") pod \"ingress-operator-5b745b69d9-bk7l4\" (UID: \"e4fc2e33-56c9-440b-a7cc-ea9982d47658\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bk7l4" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.084382 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/51c6b6f9-3b1f-4028-acc1-dac92556b401-etcd-client\") pod \"apiserver-7bbb656c7d-r9sr7\" (UID: \"51c6b6f9-3b1f-4028-acc1-dac92556b401\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r9sr7" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.084396 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51c6b6f9-3b1f-4028-acc1-dac92556b401-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-r9sr7\" (UID: \"51c6b6f9-3b1f-4028-acc1-dac92556b401\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r9sr7" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.084410 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edb7f37d-699d-46ba-bbe2-7dbb6b0f09f8-config\") pod 
\"kube-apiserver-operator-766d6c64bb-gc8hw\" (UID: \"edb7f37d-699d-46ba-bbe2-7dbb6b0f09f8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gc8hw" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.084424 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/beecb71f-3791-44c8-bee4-83585ee82c14-console-oauth-config\") pod \"console-f9d7485db-86b7j\" (UID: \"beecb71f-3791-44c8-bee4-83585ee82c14\") " pod="openshift-console/console-f9d7485db-86b7j" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.084439 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/beecb71f-3791-44c8-bee4-83585ee82c14-oauth-serving-cert\") pod \"console-f9d7485db-86b7j\" (UID: \"beecb71f-3791-44c8-bee4-83585ee82c14\") " pod="openshift-console/console-f9d7485db-86b7j" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.084463 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/51c6b6f9-3b1f-4028-acc1-dac92556b401-audit-policies\") pod \"apiserver-7bbb656c7d-r9sr7\" (UID: \"51c6b6f9-3b1f-4028-acc1-dac92556b401\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r9sr7" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.084480 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cdadab89-f0cf-4bd7-af7e-17c67a65688a-registry-tls\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.084495 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cdadab89-f0cf-4bd7-af7e-17c67a65688a-registry-certificates\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.084517 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cdadab89-f0cf-4bd7-af7e-17c67a65688a-installation-pull-secrets\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.084531 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ac75298-9ac5-4b16-8110-455c43d00945-metrics-tls\") pod \"dns-operator-744455d44c-fqmc8\" (UID: \"4ac75298-9ac5-4b16-8110-455c43d00945\") " pod="openshift-dns-operator/dns-operator-744455d44c-fqmc8" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.084547 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6485254f-aeb0-4df4-ba6e-c9eaa0718933-audit-dir\") pod \"apiserver-76f77b778f-6578b\" (UID: \"6485254f-aeb0-4df4-ba6e-c9eaa0718933\") " pod="openshift-apiserver/apiserver-76f77b778f-6578b" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.084564 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/beecb71f-3791-44c8-bee4-83585ee82c14-console-config\") pod \"console-f9d7485db-86b7j\" (UID: \"beecb71f-3791-44c8-bee4-83585ee82c14\") " pod="openshift-console/console-f9d7485db-86b7j" Feb 21 00:10:04 crc 
kubenswrapper[4906]: I0221 00:10:04.084579 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/beecb71f-3791-44c8-bee4-83585ee82c14-service-ca\") pod \"console-f9d7485db-86b7j\" (UID: \"beecb71f-3791-44c8-bee4-83585ee82c14\") " pod="openshift-console/console-f9d7485db-86b7j" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.084601 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/6485254f-aeb0-4df4-ba6e-c9eaa0718933-audit\") pod \"apiserver-76f77b778f-6578b\" (UID: \"6485254f-aeb0-4df4-ba6e-c9eaa0718933\") " pod="openshift-apiserver/apiserver-76f77b778f-6578b" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.084625 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/edb7f37d-699d-46ba-bbe2-7dbb6b0f09f8-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-gc8hw\" (UID: \"edb7f37d-699d-46ba-bbe2-7dbb6b0f09f8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gc8hw" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.084651 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f51d9e17-fed2-4d4a-aeab-8b135b6222fb-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-bp67z\" (UID: \"f51d9e17-fed2-4d4a-aeab-8b135b6222fb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bp67z" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.084674 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5bc1e348-33c5-4bfa-984f-312b58bff4cd-images\") pod \"machine-api-operator-5694c8668f-bbxgp\" (UID: 
\"5bc1e348-33c5-4bfa-984f-312b58bff4cd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bbxgp" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.084716 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/5bc1e348-33c5-4bfa-984f-312b58bff4cd-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-bbxgp\" (UID: \"5bc1e348-33c5-4bfa-984f-312b58bff4cd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bbxgp" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.084734 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6ff05c62-73da-4640-8c3f-8b846c14296c-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-vmpkj\" (UID: \"6ff05c62-73da-4640-8c3f-8b846c14296c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vmpkj" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.084750 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clft5\" (UniqueName: \"kubernetes.io/projected/beecb71f-3791-44c8-bee4-83585ee82c14-kube-api-access-clft5\") pod \"console-f9d7485db-86b7j\" (UID: \"beecb71f-3791-44c8-bee4-83585ee82c14\") " pod="openshift-console/console-f9d7485db-86b7j" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.084767 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/693c1181-a8ca-4d12-8136-8bfad07df623-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-pvsfj\" (UID: \"693c1181-a8ca-4d12-8136-8bfad07df623\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pvsfj" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.084783 4906 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2rpk\" (UniqueName: \"kubernetes.io/projected/cdadab89-f0cf-4bd7-af7e-17c67a65688a-kube-api-access-m2rpk\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.084799 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/53b67334-e090-4527-831d-e36d70482003-etcd-ca\") pod \"etcd-operator-b45778765-nbpnr\" (UID: \"53b67334-e090-4527-831d-e36d70482003\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nbpnr" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.084812 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6485254f-aeb0-4df4-ba6e-c9eaa0718933-trusted-ca-bundle\") pod \"apiserver-76f77b778f-6578b\" (UID: \"6485254f-aeb0-4df4-ba6e-c9eaa0718933\") " pod="openshift-apiserver/apiserver-76f77b778f-6578b" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.084826 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdnd8\" (UniqueName: \"kubernetes.io/projected/51c6b6f9-3b1f-4028-acc1-dac92556b401-kube-api-access-vdnd8\") pod \"apiserver-7bbb656c7d-r9sr7\" (UID: \"51c6b6f9-3b1f-4028-acc1-dac92556b401\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r9sr7" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.084841 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e4fc2e33-56c9-440b-a7cc-ea9982d47658-metrics-tls\") pod \"ingress-operator-5b745b69d9-bk7l4\" (UID: \"e4fc2e33-56c9-440b-a7cc-ea9982d47658\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bk7l4" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.084856 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e4fc2e33-56c9-440b-a7cc-ea9982d47658-bound-sa-token\") pod \"ingress-operator-5b745b69d9-bk7l4\" (UID: \"e4fc2e33-56c9-440b-a7cc-ea9982d47658\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bk7l4" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.084865 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-kf6pc"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.084875 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cdadab89-f0cf-4bd7-af7e-17c67a65688a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.084892 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/51c6b6f9-3b1f-4028-acc1-dac92556b401-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-r9sr7\" (UID: \"51c6b6f9-3b1f-4028-acc1-dac92556b401\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r9sr7" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.084908 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bc1e348-33c5-4bfa-984f-312b58bff4cd-config\") pod \"machine-api-operator-5694c8668f-bbxgp\" (UID: \"5bc1e348-33c5-4bfa-984f-312b58bff4cd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bbxgp" Feb 21 00:10:04 crc kubenswrapper[4906]: 
I0221 00:10:04.084924 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mv5s\" (UniqueName: \"kubernetes.io/projected/693c1181-a8ca-4d12-8136-8bfad07df623-kube-api-access-5mv5s\") pod \"cluster-image-registry-operator-dc59b4c8b-pvsfj\" (UID: \"693c1181-a8ca-4d12-8136-8bfad07df623\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pvsfj" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.084940 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls2rr\" (UniqueName: \"kubernetes.io/projected/f51d9e17-fed2-4d4a-aeab-8b135b6222fb-kube-api-access-ls2rr\") pod \"controller-manager-879f6c89f-bp67z\" (UID: \"f51d9e17-fed2-4d4a-aeab-8b135b6222fb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bp67z" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.084955 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e4fc2e33-56c9-440b-a7cc-ea9982d47658-trusted-ca\") pod \"ingress-operator-5b745b69d9-bk7l4\" (UID: \"e4fc2e33-56c9-440b-a7cc-ea9982d47658\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bk7l4" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.084971 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cdadab89-f0cf-4bd7-af7e-17c67a65688a-bound-sa-token\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.084987 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6485254f-aeb0-4df4-ba6e-c9eaa0718933-etcd-client\") pod \"apiserver-76f77b778f-6578b\" (UID: 
\"6485254f-aeb0-4df4-ba6e-c9eaa0718933\") " pod="openshift-apiserver/apiserver-76f77b778f-6578b" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.085004 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chh8p\" (UniqueName: \"kubernetes.io/projected/ee0bafe0-cacc-4996-9675-a87c77f3984b-kube-api-access-chh8p\") pod \"package-server-manager-789f6589d5-667sz\" (UID: \"ee0bafe0-cacc-4996-9675-a87c77f3984b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-667sz" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.085020 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6485254f-aeb0-4df4-ba6e-c9eaa0718933-encryption-config\") pod \"apiserver-76f77b778f-6578b\" (UID: \"6485254f-aeb0-4df4-ba6e-c9eaa0718933\") " pod="openshift-apiserver/apiserver-76f77b778f-6578b" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.085035 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm2jj\" (UniqueName: \"kubernetes.io/projected/4145877d-bf92-4ebc-8552-df3e4680eaf5-kube-api-access-rm2jj\") pod \"downloads-7954f5f757-84vm6\" (UID: \"4145877d-bf92-4ebc-8552-df3e4680eaf5\") " pod="openshift-console/downloads-7954f5f757-84vm6" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.085051 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0d315ba-d07b-4043-ba77-bf8859d16638-config\") pod \"route-controller-manager-6576b87f9c-w4fcj\" (UID: \"e0d315ba-d07b-4043-ba77-bf8859d16638\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w4fcj" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.085067 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/53b67334-e090-4527-831d-e36d70482003-etcd-service-ca\") pod \"etcd-operator-b45778765-nbpnr\" (UID: \"53b67334-e090-4527-831d-e36d70482003\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nbpnr" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.085086 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/6485254f-aeb0-4df4-ba6e-c9eaa0718933-image-import-ca\") pod \"apiserver-76f77b778f-6578b\" (UID: \"6485254f-aeb0-4df4-ba6e-c9eaa0718933\") " pod="openshift-apiserver/apiserver-76f77b778f-6578b" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.085102 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0d315ba-d07b-4043-ba77-bf8859d16638-serving-cert\") pod \"route-controller-manager-6576b87f9c-w4fcj\" (UID: \"e0d315ba-d07b-4043-ba77-bf8859d16638\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w4fcj" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.085119 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f51d9e17-fed2-4d4a-aeab-8b135b6222fb-config\") pod \"controller-manager-879f6c89f-bp67z\" (UID: \"f51d9e17-fed2-4d4a-aeab-8b135b6222fb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bp67z" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.085134 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f51d9e17-fed2-4d4a-aeab-8b135b6222fb-client-ca\") pod \"controller-manager-879f6c89f-bp67z\" (UID: \"f51d9e17-fed2-4d4a-aeab-8b135b6222fb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bp67z" Feb 21 00:10:04 crc 
kubenswrapper[4906]: I0221 00:10:04.085160 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53b67334-e090-4527-831d-e36d70482003-serving-cert\") pod \"etcd-operator-b45778765-nbpnr\" (UID: \"53b67334-e090-4527-831d-e36d70482003\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nbpnr" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.085176 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6485254f-aeb0-4df4-ba6e-c9eaa0718933-serving-cert\") pod \"apiserver-76f77b778f-6578b\" (UID: \"6485254f-aeb0-4df4-ba6e-c9eaa0718933\") " pod="openshift-apiserver/apiserver-76f77b778f-6578b" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.085193 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhq5z\" (UniqueName: \"kubernetes.io/projected/6ff05c62-73da-4640-8c3f-8b846c14296c-kube-api-access-fhq5z\") pod \"cluster-samples-operator-665b6dd947-vmpkj\" (UID: \"6ff05c62-73da-4640-8c3f-8b846c14296c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vmpkj" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.085209 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/beecb71f-3791-44c8-bee4-83585ee82c14-trusted-ca-bundle\") pod \"console-f9d7485db-86b7j\" (UID: \"beecb71f-3791-44c8-bee4-83585ee82c14\") " pod="openshift-console/console-f9d7485db-86b7j" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.085558 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-qxg6m"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.086258 4906 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qxg6m" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.086260 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kf6pc" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.086454 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.087705 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f51d9e17-fed2-4d4a-aeab-8b135b6222fb-config\") pod \"controller-manager-879f6c89f-bp67z\" (UID: \"f51d9e17-fed2-4d4a-aeab-8b135b6222fb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bp67z" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.087894 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f51d9e17-fed2-4d4a-aeab-8b135b6222fb-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-bp67z\" (UID: \"f51d9e17-fed2-4d4a-aeab-8b135b6222fb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bp67z" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.088068 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e4fc2e33-56c9-440b-a7cc-ea9982d47658-trusted-ca\") pod \"ingress-operator-5b745b69d9-bk7l4\" (UID: \"e4fc2e33-56c9-440b-a7cc-ea9982d47658\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bk7l4" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.088212 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hzf7\" (UniqueName: 
\"kubernetes.io/projected/6485254f-aeb0-4df4-ba6e-c9eaa0718933-kube-api-access-2hzf7\") pod \"apiserver-76f77b778f-6578b\" (UID: \"6485254f-aeb0-4df4-ba6e-c9eaa0718933\") " pod="openshift-apiserver/apiserver-76f77b778f-6578b" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.088263 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/ee0bafe0-cacc-4996-9675-a87c77f3984b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-667sz\" (UID: \"ee0bafe0-cacc-4996-9675-a87c77f3984b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-667sz" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.088305 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2jnz\" (UniqueName: \"kubernetes.io/projected/e0d315ba-d07b-4043-ba77-bf8859d16638-kube-api-access-k2jnz\") pod \"route-controller-manager-6576b87f9c-w4fcj\" (UID: \"e0d315ba-d07b-4043-ba77-bf8859d16638\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w4fcj" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.088333 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0d315ba-d07b-4043-ba77-bf8859d16638-client-ca\") pod \"route-controller-manager-6576b87f9c-w4fcj\" (UID: \"e0d315ba-d07b-4043-ba77-bf8859d16638\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w4fcj" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.088382 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgtv8\" (UniqueName: \"kubernetes.io/projected/53b67334-e090-4527-831d-e36d70482003-kube-api-access-zgtv8\") pod \"etcd-operator-b45778765-nbpnr\" (UID: 
\"53b67334-e090-4527-831d-e36d70482003\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nbpnr" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.088414 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6485254f-aeb0-4df4-ba6e-c9eaa0718933-node-pullsecrets\") pod \"apiserver-76f77b778f-6578b\" (UID: \"6485254f-aeb0-4df4-ba6e-c9eaa0718933\") " pod="openshift-apiserver/apiserver-76f77b778f-6578b" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.088437 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6485254f-aeb0-4df4-ba6e-c9eaa0718933-config\") pod \"apiserver-76f77b778f-6578b\" (UID: \"6485254f-aeb0-4df4-ba6e-c9eaa0718933\") " pod="openshift-apiserver/apiserver-76f77b778f-6578b" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.088464 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6485254f-aeb0-4df4-ba6e-c9eaa0718933-etcd-serving-ca\") pod \"apiserver-76f77b778f-6578b\" (UID: \"6485254f-aeb0-4df4-ba6e-c9eaa0718933\") " pod="openshift-apiserver/apiserver-76f77b778f-6578b" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.088470 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/693c1181-a8ca-4d12-8136-8bfad07df623-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-pvsfj\" (UID: \"693c1181-a8ca-4d12-8136-8bfad07df623\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pvsfj" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.088485 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9q2t\" (UniqueName: 
\"kubernetes.io/projected/4ac75298-9ac5-4b16-8110-455c43d00945-kube-api-access-w9q2t\") pod \"dns-operator-744455d44c-fqmc8\" (UID: \"4ac75298-9ac5-4b16-8110-455c43d00945\") " pod="openshift-dns-operator/dns-operator-744455d44c-fqmc8" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.088570 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/51c6b6f9-3b1f-4028-acc1-dac92556b401-audit-dir\") pod \"apiserver-7bbb656c7d-r9sr7\" (UID: \"51c6b6f9-3b1f-4028-acc1-dac92556b401\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r9sr7" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.088606 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/beecb71f-3791-44c8-bee4-83585ee82c14-console-serving-cert\") pod \"console-f9d7485db-86b7j\" (UID: \"beecb71f-3791-44c8-bee4-83585ee82c14\") " pod="openshift-console/console-f9d7485db-86b7j" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.088669 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.088736 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dklbx\" (UniqueName: \"kubernetes.io/projected/5bc1e348-33c5-4bfa-984f-312b58bff4cd-kube-api-access-dklbx\") pod \"machine-api-operator-5694c8668f-bbxgp\" (UID: \"5bc1e348-33c5-4bfa-984f-312b58bff4cd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bbxgp" Feb 21 00:10:04 crc 
kubenswrapper[4906]: I0221 00:10:04.088771 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cdadab89-f0cf-4bd7-af7e-17c67a65688a-trusted-ca\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.088866 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/51c6b6f9-3b1f-4028-acc1-dac92556b401-encryption-config\") pod \"apiserver-7bbb656c7d-r9sr7\" (UID: \"51c6b6f9-3b1f-4028-acc1-dac92556b401\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r9sr7" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.088935 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/693c1181-a8ca-4d12-8136-8bfad07df623-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-pvsfj\" (UID: \"693c1181-a8ca-4d12-8136-8bfad07df623\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pvsfj" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.088978 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/693c1181-a8ca-4d12-8136-8bfad07df623-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-pvsfj\" (UID: \"693c1181-a8ca-4d12-8136-8bfad07df623\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pvsfj" Feb 21 00:10:04 crc kubenswrapper[4906]: E0221 00:10:04.088996 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-21 00:10:04.588980286 +0000 UTC m=+139.840567862 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.087439 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f51d9e17-fed2-4d4a-aeab-8b135b6222fb-client-ca\") pod \"controller-manager-879f6c89f-bp67z\" (UID: \"f51d9e17-fed2-4d4a-aeab-8b135b6222fb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bp67z" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.095498 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.096237 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-v8z45"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.097387 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v8z45" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.098764 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f51d9e17-fed2-4d4a-aeab-8b135b6222fb-serving-cert\") pod \"controller-manager-879f6c89f-bp67z\" (UID: \"f51d9e17-fed2-4d4a-aeab-8b135b6222fb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bp67z" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.100805 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9kg2j"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.101334 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e4fc2e33-56c9-440b-a7cc-ea9982d47658-metrics-tls\") pod \"ingress-operator-5b745b69d9-bk7l4\" (UID: \"e4fc2e33-56c9-440b-a7cc-ea9982d47658\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bk7l4" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.101364 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mjrk7"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.101798 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.102054 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-mjrk7" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.102462 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.102673 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-84ldr"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.103126 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-84ldr" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.104376 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-xjnhh"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.105171 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xjnhh" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.107243 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-pruner-29527200-5vtx4"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.107817 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29527200-5vtx4" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.109312 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-sjlp2"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.109778 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-sjlp2" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.111710 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-bbxgp"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.113274 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/693c1181-a8ca-4d12-8136-8bfad07df623-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-pvsfj\" (UID: \"693c1181-a8ca-4d12-8136-8bfad07df623\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pvsfj" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.113285 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rhvm5"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.122400 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vmpkj"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.122502 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-6578b"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.122515 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-79w9t"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.122584 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rhvm5" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.122422 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.123354 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-w4fcj"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.123431 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-79w9t" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.125905 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-p5j2t"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.126991 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hlxkl"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.127578 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-bp67z"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.127600 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hlxkl" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.127803 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-p5j2t" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.132022 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-r9sr7"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.132050 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w9crs"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.135319 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lv2pm"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.137457 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-nbpnr"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.140202 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fdrrq"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.143621 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-667sz"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.146284 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pvsfj"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.148239 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-wxwm4"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.149178 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.149742 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-bk7l4"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.152268 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-kf6pc"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.153500 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-qxg6m"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.154725 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-sjlp2"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.155796 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-84vm6"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.156878 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-rj6xk"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.158122 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-86b7j"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.159424 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5gw6k"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.160532 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29527200-nc4ns"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.161897 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2xzp4"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.162217 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 21 00:10:04 crc 
kubenswrapper[4906]: I0221 00:10:04.163401 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tn7tl"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.164610 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-gqmhh"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.165396 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-gqmhh" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.166023 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-x64jb"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.167855 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6v9gm"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.167974 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-x64jb" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.168503 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fm5j7"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.169745 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gc8hw"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.170995 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29527200-5vtx4"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.172601 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mjrk7"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.173784 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-fqmc8"] Feb 21 00:10:04 crc 
kubenswrapper[4906]: I0221 00:10:04.174909 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rhvm5"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.175999 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-84ldr"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.177149 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-v8z45"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.178336 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-79w9t"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.179622 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-gqmhh"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.180832 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-xjnhh"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.181957 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9kg2j"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.182160 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.184364 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hlxkl"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.185514 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-x64jb"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.186606 4906 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-8l88x"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.187276 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8l88x" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.187847 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8l88x"] Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.189625 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:04 crc kubenswrapper[4906]: E0221 00:10:04.189829 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:04.689810049 +0000 UTC m=+139.941397555 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.190188 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/edb7f37d-699d-46ba-bbe2-7dbb6b0f09f8-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-gc8hw\" (UID: \"edb7f37d-699d-46ba-bbe2-7dbb6b0f09f8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gc8hw" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.190256 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51c6b6f9-3b1f-4028-acc1-dac92556b401-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-r9sr7\" (UID: \"51c6b6f9-3b1f-4028-acc1-dac92556b401\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r9sr7" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.190294 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/beecb71f-3791-44c8-bee4-83585ee82c14-console-oauth-config\") pod \"console-f9d7485db-86b7j\" (UID: \"beecb71f-3791-44c8-bee4-83585ee82c14\") " pod="openshift-console/console-f9d7485db-86b7j" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.190359 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f900c05-f074-4cf2-b045-8487fb6b95fc-config\") pod \"service-ca-operator-777779d784-sjlp2\" (UID: 
\"1f900c05-f074-4cf2-b045-8487fb6b95fc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sjlp2" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.190393 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c83a888c-071a-44b4-be7a-e1e2d2bb638e-config\") pod \"kube-controller-manager-operator-78b949d7b-rhvm5\" (UID: \"c83a888c-071a-44b4-be7a-e1e2d2bb638e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rhvm5" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.190450 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96f08c18-0ef6-469c-a9bb-3fa51058fb4d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lv2pm\" (UID: \"96f08c18-0ef6-469c-a9bb-3fa51058fb4d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lv2pm" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.190482 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ead04ba4-ffa8-4bf8-ae26-c9014dfda96f-metrics-certs\") pod \"router-default-5444994796-bzr2z\" (UID: \"ead04ba4-ffa8-4bf8-ae26-c9014dfda96f\") " pod="openshift-ingress/router-default-5444994796-bzr2z" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.190806 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpc5q\" (UniqueName: \"kubernetes.io/projected/16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e-kube-api-access-wpc5q\") pod \"marketplace-operator-79b997595-fm5j7\" (UID: \"16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e\") " pod="openshift-marketplace/marketplace-operator-79b997595-fm5j7" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.190865 4906 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxlqr\" (UniqueName: \"kubernetes.io/projected/ba583b66-6dbd-415a-b113-873eb19c4d4c-kube-api-access-bxlqr\") pod \"openshift-controller-manager-operator-756b6f6bc6-5gw6k\" (UID: \"ba583b66-6dbd-415a-b113-873eb19c4d4c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5gw6k" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.190887 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3d439193-5483-476d-9612-99dfb4558121-available-featuregates\") pod \"openshift-config-operator-7777fb866f-kf6pc\" (UID: \"3d439193-5483-476d-9612-99dfb4558121\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kf6pc" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.190905 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e8bebf2a-8dd4-4ddc-a74a-e13b789e7674-srv-cert\") pod \"olm-operator-6b444d44fb-tn7tl\" (UID: \"e8bebf2a-8dd4-4ddc-a74a-e13b789e7674\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tn7tl" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.191044 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbtd2\" (UniqueName: \"kubernetes.io/projected/eb99950d-0dff-4eba-aa36-f369fc4fdca6-kube-api-access-pbtd2\") pod \"kube-storage-version-migrator-operator-b67b599dd-84ldr\" (UID: \"eb99950d-0dff-4eba-aa36-f369fc4fdca6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-84ldr" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.191133 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" 
(UniqueName: \"kubernetes.io/configmap/3094818f-2083-4b5f-bf1d-691731224abd-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-qxg6m\" (UID: \"3094818f-2083-4b5f-bf1d-691731224abd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qxg6m" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.191158 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm9ss\" (UniqueName: \"kubernetes.io/projected/3094818f-2083-4b5f-bf1d-691731224abd-kube-api-access-qm9ss\") pod \"machine-config-controller-84d6567774-qxg6m\" (UID: \"3094818f-2083-4b5f-bf1d-691731224abd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qxg6m" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.191194 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e8bebf2a-8dd4-4ddc-a74a-e13b789e7674-profile-collector-cert\") pod \"olm-operator-6b444d44fb-tn7tl\" (UID: \"e8bebf2a-8dd4-4ddc-a74a-e13b789e7674\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tn7tl" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.191226 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-fm5j7\" (UID: \"16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e\") " pod="openshift-marketplace/marketplace-operator-79b997595-fm5j7" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.191313 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a86819ff-277a-4ef9-8bef-3de42f9406fc-signing-key\") pod \"service-ca-9c57cc56f-6v9gm\" (UID: 
\"a86819ff-277a-4ef9-8bef-3de42f9406fc\") " pod="openshift-service-ca/service-ca-9c57cc56f-6v9gm" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.191578 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f649ea1-47f1-4ae4-8b05-a5b5e2503b06-config\") pod \"authentication-operator-69f744f599-wxwm4\" (UID: \"6f649ea1-47f1-4ae4-8b05-a5b5e2503b06\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wxwm4" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.191606 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a1df0ec3-aa36-4685-8c9b-586beaf71340-secret-volume\") pod \"collect-profiles-29527200-nc4ns\" (UID: \"a1df0ec3-aa36-4685-8c9b-586beaf71340\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29527200-nc4ns" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.191636 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cdadab89-f0cf-4bd7-af7e-17c67a65688a-registry-tls\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.191655 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jv8f9\" (UniqueName: \"kubernetes.io/projected/eb81731c-9eb7-4bf7-a263-a2117fabb5cc-kube-api-access-jv8f9\") pod \"image-pruner-29527200-5vtx4\" (UID: \"eb81731c-9eb7-4bf7-a263-a2117fabb5cc\") " pod="openshift-image-registry/image-pruner-29527200-5vtx4" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.191677 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/beecb71f-3791-44c8-bee4-83585ee82c14-console-config\") pod \"console-f9d7485db-86b7j\" (UID: \"beecb71f-3791-44c8-bee4-83585ee82c14\") " pod="openshift-console/console-f9d7485db-86b7j" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.191732 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/beecb71f-3791-44c8-bee4-83585ee82c14-service-ca\") pod \"console-f9d7485db-86b7j\" (UID: \"beecb71f-3791-44c8-bee4-83585ee82c14\") " pod="openshift-console/console-f9d7485db-86b7j" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.191757 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c83a888c-071a-44b4-be7a-e1e2d2bb638e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-rhvm5\" (UID: \"c83a888c-071a-44b4-be7a-e1e2d2bb638e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rhvm5" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.191781 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/edb7f37d-699d-46ba-bbe2-7dbb6b0f09f8-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-gc8hw\" (UID: \"edb7f37d-699d-46ba-bbe2-7dbb6b0f09f8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gc8hw" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.191802 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96f08c18-0ef6-469c-a9bb-3fa51058fb4d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lv2pm\" (UID: \"96f08c18-0ef6-469c-a9bb-3fa51058fb4d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lv2pm" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 
00:10:04.191825 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cf585f0-5546-4222-903b-4466b186896f-config\") pod \"machine-approver-56656f9798-dnk5c\" (UID: \"2cf585f0-5546-4222-903b-4466b186896f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dnk5c" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.192062 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5bc1e348-33c5-4bfa-984f-312b58bff4cd-images\") pod \"machine-api-operator-5694c8668f-bbxgp\" (UID: \"5bc1e348-33c5-4bfa-984f-312b58bff4cd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bbxgp" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.192109 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqpxp\" (UniqueName: \"kubernetes.io/projected/314520d4-89fd-44fc-8eff-57534f58a1d5-kube-api-access-gqpxp\") pod \"packageserver-d55dfcdfc-w9crs\" (UID: \"314520d4-89fd-44fc-8eff-57534f58a1d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w9crs" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.192192 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/314520d4-89fd-44fc-8eff-57534f58a1d5-tmpfs\") pod \"packageserver-d55dfcdfc-w9crs\" (UID: \"314520d4-89fd-44fc-8eff-57534f58a1d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w9crs" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.192249 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-user-idp-0-file-data\") pod 
\"oauth-openshift-558db77b4-9kg2j\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") " pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.192271 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpcpv\" (UniqueName: \"kubernetes.io/projected/1b800839-991f-4aa2-be47-6cef1e5c81d0-kube-api-access-qpcpv\") pod \"console-operator-58897d9998-rj6xk\" (UID: \"1b800839-991f-4aa2-be47-6cef1e5c81d0\") " pod="openshift-console-operator/console-operator-58897d9998-rj6xk" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.192327 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/53b67334-e090-4527-831d-e36d70482003-etcd-ca\") pod \"etcd-operator-b45778765-nbpnr\" (UID: \"53b67334-e090-4527-831d-e36d70482003\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nbpnr" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.192365 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clft5\" (UniqueName: \"kubernetes.io/projected/beecb71f-3791-44c8-bee4-83585ee82c14-kube-api-access-clft5\") pod \"console-f9d7485db-86b7j\" (UID: \"beecb71f-3791-44c8-bee4-83585ee82c14\") " pod="openshift-console/console-f9d7485db-86b7j" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.192433 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/52481e6d-f985-4177-a719-d12248f049ac-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mjrk7\" (UID: \"52481e6d-f985-4177-a719-d12248f049ac\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mjrk7" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.192509 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-vdnd8\" (UniqueName: \"kubernetes.io/projected/51c6b6f9-3b1f-4028-acc1-dac92556b401-kube-api-access-vdnd8\") pod \"apiserver-7bbb656c7d-r9sr7\" (UID: \"51c6b6f9-3b1f-4028-acc1-dac92556b401\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r9sr7" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.192533 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/8cf481e0-f3f1-4e65-8ef0-5ac700f2a31e-certs\") pod \"machine-config-server-p5j2t\" (UID: \"8cf481e0-f3f1-4e65-8ef0-5ac700f2a31e\") " pod="openshift-machine-config-operator/machine-config-server-p5j2t" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.192732 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bc1e348-33c5-4bfa-984f-312b58bff4cd-config\") pod \"machine-api-operator-5694c8668f-bbxgp\" (UID: \"5bc1e348-33c5-4bfa-984f-312b58bff4cd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bbxgp" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.192772 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwlzv\" (UniqueName: \"kubernetes.io/projected/b2adbffd-6f0b-4fa3-98b2-eb9ebfb307d0-kube-api-access-zwlzv\") pod \"catalog-operator-68c6474976-79w9t\" (UID: \"b2adbffd-6f0b-4fa3-98b2-eb9ebfb307d0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-79w9t" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.192819 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f649ea1-47f1-4ae4-8b05-a5b5e2503b06-serving-cert\") pod \"authentication-operator-69f744f599-wxwm4\" (UID: \"6f649ea1-47f1-4ae4-8b05-a5b5e2503b06\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wxwm4" Feb 21 
00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.192887 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cdadab89-f0cf-4bd7-af7e-17c67a65688a-bound-sa-token\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.192907 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51c6b6f9-3b1f-4028-acc1-dac92556b401-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-r9sr7\" (UID: \"51c6b6f9-3b1f-4028-acc1-dac92556b401\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r9sr7" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.192934 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6485254f-aeb0-4df4-ba6e-c9eaa0718933-etcd-client\") pod \"apiserver-76f77b778f-6578b\" (UID: \"6485254f-aeb0-4df4-ba6e-c9eaa0718933\") " pod="openshift-apiserver/apiserver-76f77b778f-6578b" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.192982 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6599a587-de25-4c8d-82e6-102438cd2547-auth-proxy-config\") pod \"machine-config-operator-74547568cd-v8z45\" (UID: \"6599a587-de25-4c8d-82e6-102438cd2547\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v8z45" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.193036 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b800839-991f-4aa2-be47-6cef1e5c81d0-serving-cert\") pod \"console-operator-58897d9998-rj6xk\" (UID: 
\"1b800839-991f-4aa2-be47-6cef1e5c81d0\") " pod="openshift-console-operator/console-operator-58897d9998-rj6xk" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.193250 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rm2jj\" (UniqueName: \"kubernetes.io/projected/4145877d-bf92-4ebc-8552-df3e4680eaf5-kube-api-access-rm2jj\") pod \"downloads-7954f5f757-84vm6\" (UID: \"4145877d-bf92-4ebc-8552-df3e4680eaf5\") " pod="openshift-console/downloads-7954f5f757-84vm6" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.193439 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/314520d4-89fd-44fc-8eff-57534f58a1d5-apiservice-cert\") pod \"packageserver-d55dfcdfc-w9crs\" (UID: \"314520d4-89fd-44fc-8eff-57534f58a1d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w9crs" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.193519 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c83a888c-071a-44b4-be7a-e1e2d2bb638e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-rhvm5\" (UID: \"c83a888c-071a-44b4-be7a-e1e2d2bb638e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rhvm5" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.193559 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/314520d4-89fd-44fc-8eff-57534f58a1d5-webhook-cert\") pod \"packageserver-d55dfcdfc-w9crs\" (UID: \"314520d4-89fd-44fc-8eff-57534f58a1d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w9crs" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.193647 4906 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6599a587-de25-4c8d-82e6-102438cd2547-images\") pod \"machine-config-operator-74547568cd-v8z45\" (UID: \"6599a587-de25-4c8d-82e6-102438cd2547\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v8z45" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.193729 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/53b67334-e090-4527-831d-e36d70482003-etcd-service-ca\") pod \"etcd-operator-b45778765-nbpnr\" (UID: \"53b67334-e090-4527-831d-e36d70482003\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nbpnr" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.193751 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chrtq\" (UniqueName: \"kubernetes.io/projected/1f900c05-f074-4cf2-b045-8487fb6b95fc-kube-api-access-chrtq\") pod \"service-ca-operator-777779d784-sjlp2\" (UID: \"1f900c05-f074-4cf2-b045-8487fb6b95fc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sjlp2" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.194912 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb99950d-0dff-4eba-aa36-f369fc4fdca6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-84ldr\" (UID: \"eb99950d-0dff-4eba-aa36-f369fc4fdca6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-84ldr" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.194987 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba583b66-6dbd-415a-b113-873eb19c4d4c-config\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-5gw6k\" (UID: \"ba583b66-6dbd-415a-b113-873eb19c4d4c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5gw6k" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.195146 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/53b67334-e090-4527-831d-e36d70482003-etcd-ca\") pod \"etcd-operator-b45778765-nbpnr\" (UID: \"53b67334-e090-4527-831d-e36d70482003\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nbpnr" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.195356 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/53b67334-e090-4527-831d-e36d70482003-etcd-service-ca\") pod \"etcd-operator-b45778765-nbpnr\" (UID: \"53b67334-e090-4527-831d-e36d70482003\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nbpnr" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.195444 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5bc1e348-33c5-4bfa-984f-312b58bff4cd-images\") pod \"machine-api-operator-5694c8668f-bbxgp\" (UID: \"5bc1e348-33c5-4bfa-984f-312b58bff4cd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bbxgp" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.195513 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bc1e348-33c5-4bfa-984f-312b58bff4cd-config\") pod \"machine-api-operator-5694c8668f-bbxgp\" (UID: \"5bc1e348-33c5-4bfa-984f-312b58bff4cd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bbxgp" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.196079 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/edb7f37d-699d-46ba-bbe2-7dbb6b0f09f8-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-gc8hw\" (UID: \"edb7f37d-699d-46ba-bbe2-7dbb6b0f09f8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gc8hw" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.196069 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/beecb71f-3791-44c8-bee4-83585ee82c14-service-ca\") pod \"console-f9d7485db-86b7j\" (UID: \"beecb71f-3791-44c8-bee4-83585ee82c14\") " pod="openshift-console/console-f9d7485db-86b7j" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.196146 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/beecb71f-3791-44c8-bee4-83585ee82c14-console-oauth-config\") pod \"console-f9d7485db-86b7j\" (UID: \"beecb71f-3791-44c8-bee4-83585ee82c14\") " pod="openshift-console/console-f9d7485db-86b7j" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.196555 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cdadab89-f0cf-4bd7-af7e-17c67a65688a-registry-tls\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.197561 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6485254f-aeb0-4df4-ba6e-c9eaa0718933-etcd-client\") pod \"apiserver-76f77b778f-6578b\" (UID: \"6485254f-aeb0-4df4-ba6e-c9eaa0718933\") " pod="openshift-apiserver/apiserver-76f77b778f-6578b" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.199052 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/53b67334-e090-4527-831d-e36d70482003-serving-cert\") pod \"etcd-operator-b45778765-nbpnr\" (UID: \"53b67334-e090-4527-831d-e36d70482003\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nbpnr" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.205212 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/beecb71f-3791-44c8-bee4-83585ee82c14-console-config\") pod \"console-f9d7485db-86b7j\" (UID: \"beecb71f-3791-44c8-bee4-83585ee82c14\") " pod="openshift-console/console-f9d7485db-86b7j" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.207064 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53b67334-e090-4527-831d-e36d70482003-serving-cert\") pod \"etcd-operator-b45778765-nbpnr\" (UID: \"53b67334-e090-4527-831d-e36d70482003\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nbpnr" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.207142 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6485254f-aeb0-4df4-ba6e-c9eaa0718933-serving-cert\") pod \"apiserver-76f77b778f-6578b\" (UID: \"6485254f-aeb0-4df4-ba6e-c9eaa0718933\") " pod="openshift-apiserver/apiserver-76f77b778f-6578b" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.207179 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/beecb71f-3791-44c8-bee4-83585ee82c14-trusted-ca-bundle\") pod \"console-f9d7485db-86b7j\" (UID: \"beecb71f-3791-44c8-bee4-83585ee82c14\") " pod="openshift-console/console-f9d7485db-86b7j" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.207218 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztmvl\" (UniqueName: 
\"kubernetes.io/projected/b6ee1e80-1e94-4513-903d-0478a92070b4-kube-api-access-ztmvl\") pod \"control-plane-machine-set-operator-78cbb6b69f-fdrrq\" (UID: \"b6ee1e80-1e94-4513-903d-0478a92070b4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fdrrq" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.207633 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/ee0bafe0-cacc-4996-9675-a87c77f3984b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-667sz\" (UID: \"ee0bafe0-cacc-4996-9675-a87c77f3984b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-667sz" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.207671 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2jnz\" (UniqueName: \"kubernetes.io/projected/e0d315ba-d07b-4043-ba77-bf8859d16638-kube-api-access-k2jnz\") pod \"route-controller-manager-6576b87f9c-w4fcj\" (UID: \"e0d315ba-d07b-4043-ba77-bf8859d16638\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w4fcj" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.208061 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd59cacf-f27e-4731-9dca-59b01415316f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hlxkl\" (UID: \"cd59cacf-f27e-4731-9dca-59b01415316f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hlxkl" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.208139 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgtv8\" (UniqueName: \"kubernetes.io/projected/53b67334-e090-4527-831d-e36d70482003-kube-api-access-zgtv8\") pod \"etcd-operator-b45778765-nbpnr\" (UID: 
\"53b67334-e090-4527-831d-e36d70482003\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nbpnr" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.208534 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6485254f-aeb0-4df4-ba6e-c9eaa0718933-node-pullsecrets\") pod \"apiserver-76f77b778f-6578b\" (UID: \"6485254f-aeb0-4df4-ba6e-c9eaa0718933\") " pod="openshift-apiserver/apiserver-76f77b778f-6578b" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.208875 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6485254f-aeb0-4df4-ba6e-c9eaa0718933-config\") pod \"apiserver-76f77b778f-6578b\" (UID: \"6485254f-aeb0-4df4-ba6e-c9eaa0718933\") " pod="openshift-apiserver/apiserver-76f77b778f-6578b" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.208899 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6485254f-aeb0-4df4-ba6e-c9eaa0718933-etcd-serving-ca\") pod \"apiserver-76f77b778f-6578b\" (UID: \"6485254f-aeb0-4df4-ba6e-c9eaa0718933\") " pod="openshift-apiserver/apiserver-76f77b778f-6578b" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.208924 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-9kg2j\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") " pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.208951 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9q2t\" (UniqueName: 
\"kubernetes.io/projected/4ac75298-9ac5-4b16-8110-455c43d00945-kube-api-access-w9q2t\") pod \"dns-operator-744455d44c-fqmc8\" (UID: \"4ac75298-9ac5-4b16-8110-455c43d00945\") " pod="openshift-dns-operator/dns-operator-744455d44c-fqmc8" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.209157 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6485254f-aeb0-4df4-ba6e-c9eaa0718933-node-pullsecrets\") pod \"apiserver-76f77b778f-6578b\" (UID: \"6485254f-aeb0-4df4-ba6e-c9eaa0718933\") " pod="openshift-apiserver/apiserver-76f77b778f-6578b" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.209283 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/beecb71f-3791-44c8-bee4-83585ee82c14-console-serving-cert\") pod \"console-f9d7485db-86b7j\" (UID: \"beecb71f-3791-44c8-bee4-83585ee82c14\") " pod="openshift-console/console-f9d7485db-86b7j" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.209124 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/beecb71f-3791-44c8-bee4-83585ee82c14-trusted-ca-bundle\") pod \"console-f9d7485db-86b7j\" (UID: \"beecb71f-3791-44c8-bee4-83585ee82c14\") " pod="openshift-console/console-f9d7485db-86b7j" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.209316 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd59cacf-f27e-4731-9dca-59b01415316f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hlxkl\" (UID: \"cd59cacf-f27e-4731-9dca-59b01415316f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hlxkl" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.209482 4906 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dklbx\" (UniqueName: \"kubernetes.io/projected/5bc1e348-33c5-4bfa-984f-312b58bff4cd-kube-api-access-dklbx\") pod \"machine-api-operator-5694c8668f-bbxgp\" (UID: \"5bc1e348-33c5-4bfa-984f-312b58bff4cd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bbxgp" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.209572 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t6tz\" (UniqueName: \"kubernetes.io/projected/3d439193-5483-476d-9612-99dfb4558121-kube-api-access-5t6tz\") pod \"openshift-config-operator-7777fb866f-kf6pc\" (UID: \"3d439193-5483-476d-9612-99dfb4558121\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kf6pc" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.209636 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/51c6b6f9-3b1f-4028-acc1-dac92556b401-encryption-config\") pod \"apiserver-7bbb656c7d-r9sr7\" (UID: \"51c6b6f9-3b1f-4028-acc1-dac92556b401\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r9sr7" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.209650 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6485254f-aeb0-4df4-ba6e-c9eaa0718933-config\") pod \"apiserver-76f77b778f-6578b\" (UID: \"6485254f-aeb0-4df4-ba6e-c9eaa0718933\") " pod="openshift-apiserver/apiserver-76f77b778f-6578b" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.209705 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b2adbffd-6f0b-4fa3-98b2-eb9ebfb307d0-srv-cert\") pod \"catalog-operator-68c6474976-79w9t\" (UID: \"b2adbffd-6f0b-4fa3-98b2-eb9ebfb307d0\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-79w9t" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.209768 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53b67334-e090-4527-831d-e36d70482003-config\") pod \"etcd-operator-b45778765-nbpnr\" (UID: \"53b67334-e090-4527-831d-e36d70482003\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nbpnr" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.209797 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/51c6b6f9-3b1f-4028-acc1-dac92556b401-etcd-client\") pod \"apiserver-7bbb656c7d-r9sr7\" (UID: \"51c6b6f9-3b1f-4028-acc1-dac92556b401\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r9sr7" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.209850 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e0f1e160-28c6-4ff5-8b24-8c962f120747-audit-dir\") pod \"oauth-openshift-558db77b4-9kg2j\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") " pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.209882 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-9kg2j\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") " pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.209908 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/8cf481e0-f3f1-4e65-8ef0-5ac700f2a31e-node-bootstrap-token\") pod \"machine-config-server-p5j2t\" (UID: \"8cf481e0-f3f1-4e65-8ef0-5ac700f2a31e\") " pod="openshift-machine-config-operator/machine-config-server-p5j2t" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.209960 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edb7f37d-699d-46ba-bbe2-7dbb6b0f09f8-config\") pod \"kube-apiserver-operator-766d6c64bb-gc8hw\" (UID: \"edb7f37d-699d-46ba-bbe2-7dbb6b0f09f8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gc8hw" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.209986 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6599a587-de25-4c8d-82e6-102438cd2547-proxy-tls\") pod \"machine-config-operator-74547568cd-v8z45\" (UID: \"6599a587-de25-4c8d-82e6-102438cd2547\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v8z45" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.210035 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6485254f-aeb0-4df4-ba6e-c9eaa0718933-etcd-serving-ca\") pod \"apiserver-76f77b778f-6578b\" (UID: \"6485254f-aeb0-4df4-ba6e-c9eaa0718933\") " pod="openshift-apiserver/apiserver-76f77b778f-6578b" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.210062 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/beecb71f-3791-44c8-bee4-83585ee82c14-oauth-serving-cert\") pod \"console-f9d7485db-86b7j\" (UID: \"beecb71f-3791-44c8-bee4-83585ee82c14\") " pod="openshift-console/console-f9d7485db-86b7j" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.210118 4906 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-9kg2j\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") " pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.210143 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-9kg2j\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") " pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.210192 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7mlm\" (UniqueName: \"kubernetes.io/projected/e8bebf2a-8dd4-4ddc-a74a-e13b789e7674-kube-api-access-b7mlm\") pod \"olm-operator-6b444d44fb-tn7tl\" (UID: \"e8bebf2a-8dd4-4ddc-a74a-e13b789e7674\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tn7tl" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.210215 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpwzr\" (UniqueName: \"kubernetes.io/projected/ead04ba4-ffa8-4bf8-ae26-c9014dfda96f-kube-api-access-dpwzr\") pod \"router-default-5444994796-bzr2z\" (UID: \"ead04ba4-ffa8-4bf8-ae26-c9014dfda96f\") " pod="openshift-ingress/router-default-5444994796-bzr2z" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.210259 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/51c6b6f9-3b1f-4028-acc1-dac92556b401-audit-policies\") pod 
\"apiserver-7bbb656c7d-r9sr7\" (UID: \"51c6b6f9-3b1f-4028-acc1-dac92556b401\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r9sr7" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.210280 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s95bh\" (UniqueName: \"kubernetes.io/projected/6599a587-de25-4c8d-82e6-102438cd2547-kube-api-access-s95bh\") pod \"machine-config-operator-74547568cd-v8z45\" (UID: \"6599a587-de25-4c8d-82e6-102438cd2547\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v8z45" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.210300 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cdadab89-f0cf-4bd7-af7e-17c67a65688a-registry-certificates\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.210339 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cdadab89-f0cf-4bd7-af7e-17c67a65688a-installation-pull-secrets\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.210357 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ac75298-9ac5-4b16-8110-455c43d00945-metrics-tls\") pod \"dns-operator-744455d44c-fqmc8\" (UID: \"4ac75298-9ac5-4b16-8110-455c43d00945\") " pod="openshift-dns-operator/dns-operator-744455d44c-fqmc8" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.210373 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6485254f-aeb0-4df4-ba6e-c9eaa0718933-audit-dir\") pod \"apiserver-76f77b778f-6578b\" (UID: \"6485254f-aeb0-4df4-ba6e-c9eaa0718933\") " pod="openshift-apiserver/apiserver-76f77b778f-6578b" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.210418 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfkqm\" (UniqueName: \"kubernetes.io/projected/e0f1e160-28c6-4ff5-8b24-8c962f120747-kube-api-access-dfkqm\") pod \"oauth-openshift-558db77b4-9kg2j\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") " pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.210454 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3094818f-2083-4b5f-bf1d-691731224abd-proxy-tls\") pod \"machine-config-controller-84d6567774-qxg6m\" (UID: \"3094818f-2083-4b5f-bf1d-691731224abd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qxg6m" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.210501 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/b6ee1e80-1e94-4513-903d-0478a92070b4-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-fdrrq\" (UID: \"b6ee1e80-1e94-4513-903d-0478a92070b4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fdrrq" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.210533 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/6485254f-aeb0-4df4-ba6e-c9eaa0718933-audit\") pod \"apiserver-76f77b778f-6578b\" (UID: \"6485254f-aeb0-4df4-ba6e-c9eaa0718933\") " 
pod="openshift-apiserver/apiserver-76f77b778f-6578b" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.210584 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a1df0ec3-aa36-4685-8c9b-586beaf71340-config-volume\") pod \"collect-profiles-29527200-nc4ns\" (UID: \"a1df0ec3-aa36-4685-8c9b-586beaf71340\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29527200-nc4ns" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.210598 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/ee0bafe0-cacc-4996-9675-a87c77f3984b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-667sz\" (UID: \"ee0bafe0-cacc-4996-9675-a87c77f3984b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-667sz" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.210618 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/5bc1e348-33c5-4bfa-984f-312b58bff4cd-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-bbxgp\" (UID: \"5bc1e348-33c5-4bfa-984f-312b58bff4cd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bbxgp" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.210650 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53b67334-e090-4527-831d-e36d70482003-config\") pod \"etcd-operator-b45778765-nbpnr\" (UID: \"53b67334-e090-4527-831d-e36d70482003\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nbpnr" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.210674 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6ff05c62-73da-4640-8c3f-8b846c14296c-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-vmpkj\" (UID: \"6ff05c62-73da-4640-8c3f-8b846c14296c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vmpkj" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.210734 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f649ea1-47f1-4ae4-8b05-a5b5e2503b06-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-wxwm4\" (UID: \"6f649ea1-47f1-4ae4-8b05-a5b5e2503b06\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wxwm4" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.210759 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2rpk\" (UniqueName: \"kubernetes.io/projected/cdadab89-f0cf-4bd7-af7e-17c67a65688a-kube-api-access-m2rpk\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.210797 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6485254f-aeb0-4df4-ba6e-c9eaa0718933-trusted-ca-bundle\") pod \"apiserver-76f77b778f-6578b\" (UID: \"6485254f-aeb0-4df4-ba6e-c9eaa0718933\") " pod="openshift-apiserver/apiserver-76f77b778f-6578b" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.210819 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmmwh\" (UniqueName: \"kubernetes.io/projected/2cf585f0-5546-4222-903b-4466b186896f-kube-api-access-wmmwh\") pod \"machine-approver-56656f9798-dnk5c\" (UID: \"2cf585f0-5546-4222-903b-4466b186896f\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dnk5c" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.210866 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cdadab89-f0cf-4bd7-af7e-17c67a65688a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.210887 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/51c6b6f9-3b1f-4028-acc1-dac92556b401-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-r9sr7\" (UID: \"51c6b6f9-3b1f-4028-acc1-dac92556b401\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r9sr7" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.210908 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-fm5j7\" (UID: \"16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e\") " pod="openshift-marketplace/marketplace-operator-79b997595-fm5j7" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.210950 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h97rj\" (UniqueName: \"kubernetes.io/projected/a86819ff-277a-4ef9-8bef-3de42f9406fc-kube-api-access-h97rj\") pod \"service-ca-9c57cc56f-6v9gm\" (UID: \"a86819ff-277a-4ef9-8bef-3de42f9406fc\") " pod="openshift-service-ca/service-ca-9c57cc56f-6v9gm" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.210969 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-9kg2j\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") " pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.210995 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/eb81731c-9eb7-4bf7-a263-a2117fabb5cc-serviceca\") pod \"image-pruner-29527200-5vtx4\" (UID: \"eb81731c-9eb7-4bf7-a263-a2117fabb5cc\") " pod="openshift-image-registry/image-pruner-29527200-5vtx4" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.211045 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a86819ff-277a-4ef9-8bef-3de42f9406fc-signing-cabundle\") pod \"service-ca-9c57cc56f-6v9gm\" (UID: \"a86819ff-277a-4ef9-8bef-3de42f9406fc\") " pod="openshift-service-ca/service-ca-9c57cc56f-6v9gm" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.211084 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chh8p\" (UniqueName: \"kubernetes.io/projected/ee0bafe0-cacc-4996-9675-a87c77f3984b-kube-api-access-chh8p\") pod \"package-server-manager-789f6589d5-667sz\" (UID: \"ee0bafe0-cacc-4996-9675-a87c77f3984b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-667sz" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.211142 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-9kg2j\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") " pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j" Feb 21 
00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.211202 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6485254f-aeb0-4df4-ba6e-c9eaa0718933-encryption-config\") pod \"apiserver-76f77b778f-6578b\" (UID: \"6485254f-aeb0-4df4-ba6e-c9eaa0718933\") " pod="openshift-apiserver/apiserver-76f77b778f-6578b" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.211222 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-9kg2j\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") " pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.211241 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1b800839-991f-4aa2-be47-6cef1e5c81d0-trusted-ca\") pod \"console-operator-58897d9998-rj6xk\" (UID: \"1b800839-991f-4aa2-be47-6cef1e5c81d0\") " pod="openshift-console-operator/console-operator-58897d9998-rj6xk" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.211258 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0d315ba-d07b-4043-ba77-bf8859d16638-config\") pod \"route-controller-manager-6576b87f9c-w4fcj\" (UID: \"e0d315ba-d07b-4043-ba77-bf8859d16638\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w4fcj" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.211283 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7kzn\" (UniqueName: 
\"kubernetes.io/projected/96f08c18-0ef6-469c-a9bb-3fa51058fb4d-kube-api-access-t7kzn\") pod \"openshift-apiserver-operator-796bbdcf4f-lv2pm\" (UID: \"96f08c18-0ef6-469c-a9bb-3fa51058fb4d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lv2pm" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.211301 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t46q\" (UniqueName: \"kubernetes.io/projected/a1df0ec3-aa36-4685-8c9b-586beaf71340-kube-api-access-5t46q\") pod \"collect-profiles-29527200-nc4ns\" (UID: \"a1df0ec3-aa36-4685-8c9b-586beaf71340\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29527200-nc4ns" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.211326 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/6485254f-aeb0-4df4-ba6e-c9eaa0718933-image-import-ca\") pod \"apiserver-76f77b778f-6578b\" (UID: \"6485254f-aeb0-4df4-ba6e-c9eaa0718933\") " pod="openshift-apiserver/apiserver-76f77b778f-6578b" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.211344 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0d315ba-d07b-4043-ba77-bf8859d16638-serving-cert\") pod \"route-controller-manager-6576b87f9c-w4fcj\" (UID: \"e0d315ba-d07b-4043-ba77-bf8859d16638\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w4fcj" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.211360 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e0f1e160-28c6-4ff5-8b24-8c962f120747-audit-policies\") pod \"oauth-openshift-558db77b4-9kg2j\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") " pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j" Feb 21 
00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.211378 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ead04ba4-ffa8-4bf8-ae26-c9014dfda96f-service-ca-bundle\") pod \"router-default-5444994796-bzr2z\" (UID: \"ead04ba4-ffa8-4bf8-ae26-c9014dfda96f\") " pod="openshift-ingress/router-default-5444994796-bzr2z" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.211394 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ead04ba4-ffa8-4bf8-ae26-c9014dfda96f-stats-auth\") pod \"router-default-5444994796-bzr2z\" (UID: \"ead04ba4-ffa8-4bf8-ae26-c9014dfda96f\") " pod="openshift-ingress/router-default-5444994796-bzr2z" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.211415 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhq5z\" (UniqueName: \"kubernetes.io/projected/6ff05c62-73da-4640-8c3f-8b846c14296c-kube-api-access-fhq5z\") pod \"cluster-samples-operator-665b6dd947-vmpkj\" (UID: \"6ff05c62-73da-4640-8c3f-8b846c14296c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vmpkj" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.211436 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-9kg2j\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") " pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.211453 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3d439193-5483-476d-9612-99dfb4558121-serving-cert\") pod \"openshift-config-operator-7777fb866f-kf6pc\" (UID: \"3d439193-5483-476d-9612-99dfb4558121\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kf6pc" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.211478 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hzf7\" (UniqueName: \"kubernetes.io/projected/6485254f-aeb0-4df4-ba6e-c9eaa0718933-kube-api-access-2hzf7\") pod \"apiserver-76f77b778f-6578b\" (UID: \"6485254f-aeb0-4df4-ba6e-c9eaa0718933\") " pod="openshift-apiserver/apiserver-76f77b778f-6578b" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.211496 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk5tl\" (UniqueName: \"kubernetes.io/projected/52481e6d-f985-4177-a719-d12248f049ac-kube-api-access-hk5tl\") pod \"multus-admission-controller-857f4d67dd-mjrk7\" (UID: \"52481e6d-f985-4177-a719-d12248f049ac\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mjrk7" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.211512 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-9kg2j\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") " pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.211528 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2cf585f0-5546-4222-903b-4466b186896f-machine-approver-tls\") pod \"machine-approver-56656f9798-dnk5c\" (UID: \"2cf585f0-5546-4222-903b-4466b186896f\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dnk5c" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.211543 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b800839-991f-4aa2-be47-6cef1e5c81d0-config\") pod \"console-operator-58897d9998-rj6xk\" (UID: \"1b800839-991f-4aa2-be47-6cef1e5c81d0\") " pod="openshift-console-operator/console-operator-58897d9998-rj6xk" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.211558 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r569\" (UniqueName: \"kubernetes.io/projected/8cf481e0-f3f1-4e65-8ef0-5ac700f2a31e-kube-api-access-7r569\") pod \"machine-config-server-p5j2t\" (UID: \"8cf481e0-f3f1-4e65-8ef0-5ac700f2a31e\") " pod="openshift-machine-config-operator/machine-config-server-p5j2t" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.211577 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0d315ba-d07b-4043-ba77-bf8859d16638-client-ca\") pod \"route-controller-manager-6576b87f9c-w4fcj\" (UID: \"e0d315ba-d07b-4043-ba77-bf8859d16638\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w4fcj" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.211594 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cd59cacf-f27e-4731-9dca-59b01415316f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hlxkl\" (UID: \"cd59cacf-f27e-4731-9dca-59b01415316f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hlxkl" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.211611 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba583b66-6dbd-415a-b113-873eb19c4d4c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-5gw6k\" (UID: \"ba583b66-6dbd-415a-b113-873eb19c4d4c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5gw6k" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.211629 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-9kg2j\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") " pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.211647 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hnqx\" (UniqueName: \"kubernetes.io/projected/6f649ea1-47f1-4ae4-8b05-a5b5e2503b06-kube-api-access-2hnqx\") pod \"authentication-operator-69f744f599-wxwm4\" (UID: \"6f649ea1-47f1-4ae4-8b05-a5b5e2503b06\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wxwm4" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.211658 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edb7f37d-699d-46ba-bbe2-7dbb6b0f09f8-config\") pod \"kube-apiserver-operator-766d6c64bb-gc8hw\" (UID: \"edb7f37d-699d-46ba-bbe2-7dbb6b0f09f8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gc8hw" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.211663 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgtpk\" (UniqueName: \"kubernetes.io/projected/37fa17e7-c0eb-4d5a-a36e-fe257b56e1cb-kube-api-access-jgtpk\") pod 
\"migrator-59844c95c7-xjnhh\" (UID: \"37fa17e7-c0eb-4d5a-a36e-fe257b56e1cb\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xjnhh" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.211751 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/51c6b6f9-3b1f-4028-acc1-dac92556b401-audit-dir\") pod \"apiserver-7bbb656c7d-r9sr7\" (UID: \"51c6b6f9-3b1f-4028-acc1-dac92556b401\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r9sr7" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.211787 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb99950d-0dff-4eba-aa36-f369fc4fdca6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-84ldr\" (UID: \"eb99950d-0dff-4eba-aa36-f369fc4fdca6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-84ldr" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.211824 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.211852 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f900c05-f074-4cf2-b045-8487fb6b95fc-serving-cert\") pod \"service-ca-operator-777779d784-sjlp2\" (UID: \"1f900c05-f074-4cf2-b045-8487fb6b95fc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sjlp2" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.211901 4906 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cdadab89-f0cf-4bd7-af7e-17c67a65688a-trusted-ca\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.211921 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f649ea1-47f1-4ae4-8b05-a5b5e2503b06-service-ca-bundle\") pod \"authentication-operator-69f744f599-wxwm4\" (UID: \"6f649ea1-47f1-4ae4-8b05-a5b5e2503b06\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wxwm4" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.211941 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ead04ba4-ffa8-4bf8-ae26-c9014dfda96f-default-certificate\") pod \"router-default-5444994796-bzr2z\" (UID: \"ead04ba4-ffa8-4bf8-ae26-c9014dfda96f\") " pod="openshift-ingress/router-default-5444994796-bzr2z" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.211963 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2cf585f0-5546-4222-903b-4466b186896f-auth-proxy-config\") pod \"machine-approver-56656f9798-dnk5c\" (UID: \"2cf585f0-5546-4222-903b-4466b186896f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dnk5c" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.211989 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b2adbffd-6f0b-4fa3-98b2-eb9ebfb307d0-profile-collector-cert\") pod 
\"catalog-operator-68c6474976-79w9t\" (UID: \"b2adbffd-6f0b-4fa3-98b2-eb9ebfb307d0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-79w9t" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.212015 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/53b67334-e090-4527-831d-e36d70482003-etcd-client\") pod \"etcd-operator-b45778765-nbpnr\" (UID: \"53b67334-e090-4527-831d-e36d70482003\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nbpnr" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.212042 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51c6b6f9-3b1f-4028-acc1-dac92556b401-serving-cert\") pod \"apiserver-7bbb656c7d-r9sr7\" (UID: \"51c6b6f9-3b1f-4028-acc1-dac92556b401\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r9sr7" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.212247 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/51c6b6f9-3b1f-4028-acc1-dac92556b401-encryption-config\") pod \"apiserver-7bbb656c7d-r9sr7\" (UID: \"51c6b6f9-3b1f-4028-acc1-dac92556b401\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r9sr7" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.212490 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/beecb71f-3791-44c8-bee4-83585ee82c14-oauth-serving-cert\") pod \"console-f9d7485db-86b7j\" (UID: \"beecb71f-3791-44c8-bee4-83585ee82c14\") " pod="openshift-console/console-f9d7485db-86b7j" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.213348 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/6485254f-aeb0-4df4-ba6e-c9eaa0718933-image-import-ca\") pod 
\"apiserver-76f77b778f-6578b\" (UID: \"6485254f-aeb0-4df4-ba6e-c9eaa0718933\") " pod="openshift-apiserver/apiserver-76f77b778f-6578b" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.213673 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cdadab89-f0cf-4bd7-af7e-17c67a65688a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.214082 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/51c6b6f9-3b1f-4028-acc1-dac92556b401-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-r9sr7\" (UID: \"51c6b6f9-3b1f-4028-acc1-dac92556b401\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r9sr7" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.214081 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6485254f-aeb0-4df4-ba6e-c9eaa0718933-serving-cert\") pod \"apiserver-76f77b778f-6578b\" (UID: \"6485254f-aeb0-4df4-ba6e-c9eaa0718933\") " pod="openshift-apiserver/apiserver-76f77b778f-6578b" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.214167 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cdadab89-f0cf-4bd7-af7e-17c67a65688a-registry-certificates\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.214353 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/51c6b6f9-3b1f-4028-acc1-dac92556b401-audit-dir\") pod 
\"apiserver-7bbb656c7d-r9sr7\" (UID: \"51c6b6f9-3b1f-4028-acc1-dac92556b401\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r9sr7" Feb 21 00:10:04 crc kubenswrapper[4906]: E0221 00:10:04.214377 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:04.714356671 +0000 UTC m=+139.965944267 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.214863 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51c6b6f9-3b1f-4028-acc1-dac92556b401-serving-cert\") pod \"apiserver-7bbb656c7d-r9sr7\" (UID: \"51c6b6f9-3b1f-4028-acc1-dac92556b401\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r9sr7" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.215437 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/51c6b6f9-3b1f-4028-acc1-dac92556b401-etcd-client\") pod \"apiserver-7bbb656c7d-r9sr7\" (UID: \"51c6b6f9-3b1f-4028-acc1-dac92556b401\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r9sr7" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.215464 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cdadab89-f0cf-4bd7-af7e-17c67a65688a-trusted-ca\") pod 
\"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.216090 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/51c6b6f9-3b1f-4028-acc1-dac92556b401-audit-policies\") pod \"apiserver-7bbb656c7d-r9sr7\" (UID: \"51c6b6f9-3b1f-4028-acc1-dac92556b401\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r9sr7" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.216452 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0d315ba-d07b-4043-ba77-bf8859d16638-config\") pod \"route-controller-manager-6576b87f9c-w4fcj\" (UID: \"e0d315ba-d07b-4043-ba77-bf8859d16638\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w4fcj" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.216493 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6485254f-aeb0-4df4-ba6e-c9eaa0718933-audit-dir\") pod \"apiserver-76f77b778f-6578b\" (UID: \"6485254f-aeb0-4df4-ba6e-c9eaa0718933\") " pod="openshift-apiserver/apiserver-76f77b778f-6578b" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.217376 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/6485254f-aeb0-4df4-ba6e-c9eaa0718933-audit\") pod \"apiserver-76f77b778f-6578b\" (UID: \"6485254f-aeb0-4df4-ba6e-c9eaa0718933\") " pod="openshift-apiserver/apiserver-76f77b778f-6578b" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.217899 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6485254f-aeb0-4df4-ba6e-c9eaa0718933-trusted-ca-bundle\") pod \"apiserver-76f77b778f-6578b\" (UID: 
\"6485254f-aeb0-4df4-ba6e-c9eaa0718933\") " pod="openshift-apiserver/apiserver-76f77b778f-6578b" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.218352 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0d315ba-d07b-4043-ba77-bf8859d16638-client-ca\") pod \"route-controller-manager-6576b87f9c-w4fcj\" (UID: \"e0d315ba-d07b-4043-ba77-bf8859d16638\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w4fcj" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.218864 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/beecb71f-3791-44c8-bee4-83585ee82c14-console-serving-cert\") pod \"console-f9d7485db-86b7j\" (UID: \"beecb71f-3791-44c8-bee4-83585ee82c14\") " pod="openshift-console/console-f9d7485db-86b7j" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.219061 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0d315ba-d07b-4043-ba77-bf8859d16638-serving-cert\") pod \"route-controller-manager-6576b87f9c-w4fcj\" (UID: \"e0d315ba-d07b-4043-ba77-bf8859d16638\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w4fcj" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.220055 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ac75298-9ac5-4b16-8110-455c43d00945-metrics-tls\") pod \"dns-operator-744455d44c-fqmc8\" (UID: \"4ac75298-9ac5-4b16-8110-455c43d00945\") " pod="openshift-dns-operator/dns-operator-744455d44c-fqmc8" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.220167 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cdadab89-f0cf-4bd7-af7e-17c67a65688a-installation-pull-secrets\") pod 
\"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.220175 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6ff05c62-73da-4640-8c3f-8b846c14296c-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-vmpkj\" (UID: \"6ff05c62-73da-4640-8c3f-8b846c14296c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vmpkj" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.220609 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/5bc1e348-33c5-4bfa-984f-312b58bff4cd-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-bbxgp\" (UID: \"5bc1e348-33c5-4bfa-984f-312b58bff4cd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bbxgp" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.220808 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/53b67334-e090-4527-831d-e36d70482003-etcd-client\") pod \"etcd-operator-b45778765-nbpnr\" (UID: \"53b67334-e090-4527-831d-e36d70482003\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nbpnr" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.221002 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6485254f-aeb0-4df4-ba6e-c9eaa0718933-encryption-config\") pod \"apiserver-76f77b778f-6578b\" (UID: \"6485254f-aeb0-4df4-ba6e-c9eaa0718933\") " pod="openshift-apiserver/apiserver-76f77b778f-6578b" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.221384 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 21 00:10:04 crc 
kubenswrapper[4906]: I0221 00:10:04.241974 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.261660 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.281625 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.302081 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.313101 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:04 crc kubenswrapper[4906]: E0221 00:10:04.313269 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:04.813243908 +0000 UTC m=+140.064831414 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.313342 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/314520d4-89fd-44fc-8eff-57534f58a1d5-apiservice-cert\") pod \"packageserver-d55dfcdfc-w9crs\" (UID: \"314520d4-89fd-44fc-8eff-57534f58a1d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w9crs" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.313372 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/314520d4-89fd-44fc-8eff-57534f58a1d5-webhook-cert\") pod \"packageserver-d55dfcdfc-w9crs\" (UID: \"314520d4-89fd-44fc-8eff-57534f58a1d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w9crs" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.313397 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c83a888c-071a-44b4-be7a-e1e2d2bb638e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-rhvm5\" (UID: \"c83a888c-071a-44b4-be7a-e1e2d2bb638e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rhvm5" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.313417 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6599a587-de25-4c8d-82e6-102438cd2547-images\") pod 
\"machine-config-operator-74547568cd-v8z45\" (UID: \"6599a587-de25-4c8d-82e6-102438cd2547\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v8z45" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.313441 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chrtq\" (UniqueName: \"kubernetes.io/projected/1f900c05-f074-4cf2-b045-8487fb6b95fc-kube-api-access-chrtq\") pod \"service-ca-operator-777779d784-sjlp2\" (UID: \"1f900c05-f074-4cf2-b045-8487fb6b95fc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sjlp2" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.313460 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb99950d-0dff-4eba-aa36-f369fc4fdca6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-84ldr\" (UID: \"eb99950d-0dff-4eba-aa36-f369fc4fdca6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-84ldr" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.313492 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba583b66-6dbd-415a-b113-873eb19c4d4c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-5gw6k\" (UID: \"ba583b66-6dbd-415a-b113-873eb19c4d4c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5gw6k" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.313515 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztmvl\" (UniqueName: \"kubernetes.io/projected/b6ee1e80-1e94-4513-903d-0478a92070b4-kube-api-access-ztmvl\") pod \"control-plane-machine-set-operator-78cbb6b69f-fdrrq\" (UID: \"b6ee1e80-1e94-4513-903d-0478a92070b4\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fdrrq" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.313538 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd59cacf-f27e-4731-9dca-59b01415316f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hlxkl\" (UID: \"cd59cacf-f27e-4731-9dca-59b01415316f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hlxkl" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.313581 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-9kg2j\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") " pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.313622 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd59cacf-f27e-4731-9dca-59b01415316f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hlxkl\" (UID: \"cd59cacf-f27e-4731-9dca-59b01415316f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hlxkl" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.313648 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5t6tz\" (UniqueName: \"kubernetes.io/projected/3d439193-5483-476d-9612-99dfb4558121-kube-api-access-5t6tz\") pod \"openshift-config-operator-7777fb866f-kf6pc\" (UID: \"3d439193-5483-476d-9612-99dfb4558121\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kf6pc" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.313673 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b2adbffd-6f0b-4fa3-98b2-eb9ebfb307d0-srv-cert\") pod \"catalog-operator-68c6474976-79w9t\" (UID: \"b2adbffd-6f0b-4fa3-98b2-eb9ebfb307d0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-79w9t"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.313723 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e0f1e160-28c6-4ff5-8b24-8c962f120747-audit-dir\") pod \"oauth-openshift-558db77b4-9kg2j\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") " pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.313746 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-9kg2j\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") " pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.313769 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/8cf481e0-f3f1-4e65-8ef0-5ac700f2a31e-node-bootstrap-token\") pod \"machine-config-server-p5j2t\" (UID: \"8cf481e0-f3f1-4e65-8ef0-5ac700f2a31e\") " pod="openshift-machine-config-operator/machine-config-server-p5j2t"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.313791 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6599a587-de25-4c8d-82e6-102438cd2547-proxy-tls\") pod \"machine-config-operator-74547568cd-v8z45\" (UID: \"6599a587-de25-4c8d-82e6-102438cd2547\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v8z45"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.313809 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-9kg2j\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") " pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.313824 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-9kg2j\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") " pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.313851 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7mlm\" (UniqueName: \"kubernetes.io/projected/e8bebf2a-8dd4-4ddc-a74a-e13b789e7674-kube-api-access-b7mlm\") pod \"olm-operator-6b444d44fb-tn7tl\" (UID: \"e8bebf2a-8dd4-4ddc-a74a-e13b789e7674\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tn7tl"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.313871 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpwzr\" (UniqueName: \"kubernetes.io/projected/ead04ba4-ffa8-4bf8-ae26-c9014dfda96f-kube-api-access-dpwzr\") pod \"router-default-5444994796-bzr2z\" (UID: \"ead04ba4-ffa8-4bf8-ae26-c9014dfda96f\") " pod="openshift-ingress/router-default-5444994796-bzr2z"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.313888 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s95bh\" (UniqueName: \"kubernetes.io/projected/6599a587-de25-4c8d-82e6-102438cd2547-kube-api-access-s95bh\") pod \"machine-config-operator-74547568cd-v8z45\" (UID: \"6599a587-de25-4c8d-82e6-102438cd2547\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v8z45"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.313906 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfkqm\" (UniqueName: \"kubernetes.io/projected/e0f1e160-28c6-4ff5-8b24-8c962f120747-kube-api-access-dfkqm\") pod \"oauth-openshift-558db77b4-9kg2j\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") " pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.313927 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3094818f-2083-4b5f-bf1d-691731224abd-proxy-tls\") pod \"machine-config-controller-84d6567774-qxg6m\" (UID: \"3094818f-2083-4b5f-bf1d-691731224abd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qxg6m"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.313945 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/b6ee1e80-1e94-4513-903d-0478a92070b4-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-fdrrq\" (UID: \"b6ee1e80-1e94-4513-903d-0478a92070b4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fdrrq"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.313963 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a1df0ec3-aa36-4685-8c9b-586beaf71340-config-volume\") pod \"collect-profiles-29527200-nc4ns\" (UID: \"a1df0ec3-aa36-4685-8c9b-586beaf71340\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29527200-nc4ns"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.313981 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f649ea1-47f1-4ae4-8b05-a5b5e2503b06-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-wxwm4\" (UID: \"6f649ea1-47f1-4ae4-8b05-a5b5e2503b06\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wxwm4"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.314007 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmmwh\" (UniqueName: \"kubernetes.io/projected/2cf585f0-5546-4222-903b-4466b186896f-kube-api-access-wmmwh\") pod \"machine-approver-56656f9798-dnk5c\" (UID: \"2cf585f0-5546-4222-903b-4466b186896f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dnk5c"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.314032 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-fm5j7\" (UID: \"16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e\") " pod="openshift-marketplace/marketplace-operator-79b997595-fm5j7"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.314051 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h97rj\" (UniqueName: \"kubernetes.io/projected/a86819ff-277a-4ef9-8bef-3de42f9406fc-kube-api-access-h97rj\") pod \"service-ca-9c57cc56f-6v9gm\" (UID: \"a86819ff-277a-4ef9-8bef-3de42f9406fc\") " pod="openshift-service-ca/service-ca-9c57cc56f-6v9gm"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.314067 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-9kg2j\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") " pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.314088 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/eb81731c-9eb7-4bf7-a263-a2117fabb5cc-serviceca\") pod \"image-pruner-29527200-5vtx4\" (UID: \"eb81731c-9eb7-4bf7-a263-a2117fabb5cc\") " pod="openshift-image-registry/image-pruner-29527200-5vtx4"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.314106 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a86819ff-277a-4ef9-8bef-3de42f9406fc-signing-cabundle\") pod \"service-ca-9c57cc56f-6v9gm\" (UID: \"a86819ff-277a-4ef9-8bef-3de42f9406fc\") " pod="openshift-service-ca/service-ca-9c57cc56f-6v9gm"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.314131 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-9kg2j\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") " pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.314149 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-9kg2j\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") " pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.314165 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1b800839-991f-4aa2-be47-6cef1e5c81d0-trusted-ca\") pod \"console-operator-58897d9998-rj6xk\" (UID: \"1b800839-991f-4aa2-be47-6cef1e5c81d0\") " pod="openshift-console-operator/console-operator-58897d9998-rj6xk"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.314182 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7kzn\" (UniqueName: \"kubernetes.io/projected/96f08c18-0ef6-469c-a9bb-3fa51058fb4d-kube-api-access-t7kzn\") pod \"openshift-apiserver-operator-796bbdcf4f-lv2pm\" (UID: \"96f08c18-0ef6-469c-a9bb-3fa51058fb4d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lv2pm"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.314197 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5t46q\" (UniqueName: \"kubernetes.io/projected/a1df0ec3-aa36-4685-8c9b-586beaf71340-kube-api-access-5t46q\") pod \"collect-profiles-29527200-nc4ns\" (UID: \"a1df0ec3-aa36-4685-8c9b-586beaf71340\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29527200-nc4ns"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.314218 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e0f1e160-28c6-4ff5-8b24-8c962f120747-audit-policies\") pod \"oauth-openshift-558db77b4-9kg2j\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") " pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.314233 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ead04ba4-ffa8-4bf8-ae26-c9014dfda96f-service-ca-bundle\") pod \"router-default-5444994796-bzr2z\" (UID: \"ead04ba4-ffa8-4bf8-ae26-c9014dfda96f\") " pod="openshift-ingress/router-default-5444994796-bzr2z"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.314259 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ead04ba4-ffa8-4bf8-ae26-c9014dfda96f-stats-auth\") pod \"router-default-5444994796-bzr2z\" (UID: \"ead04ba4-ffa8-4bf8-ae26-c9014dfda96f\") " pod="openshift-ingress/router-default-5444994796-bzr2z"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.315259 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a86819ff-277a-4ef9-8bef-3de42f9406fc-signing-cabundle\") pod \"service-ca-9c57cc56f-6v9gm\" (UID: \"a86819ff-277a-4ef9-8bef-3de42f9406fc\") " pod="openshift-service-ca/service-ca-9c57cc56f-6v9gm"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.316190 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e0f1e160-28c6-4ff5-8b24-8c962f120747-audit-dir\") pod \"oauth-openshift-558db77b4-9kg2j\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") " pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.317255 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a1df0ec3-aa36-4685-8c9b-586beaf71340-config-volume\") pod \"collect-profiles-29527200-nc4ns\" (UID: \"a1df0ec3-aa36-4685-8c9b-586beaf71340\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29527200-nc4ns"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.317613 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/b6ee1e80-1e94-4513-903d-0478a92070b4-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-fdrrq\" (UID: \"b6ee1e80-1e94-4513-903d-0478a92070b4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fdrrq"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.317868 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f649ea1-47f1-4ae4-8b05-a5b5e2503b06-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-wxwm4\" (UID: \"6f649ea1-47f1-4ae4-8b05-a5b5e2503b06\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wxwm4"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.319092 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-9kg2j\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") " pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.319137 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d439193-5483-476d-9612-99dfb4558121-serving-cert\") pod \"openshift-config-operator-7777fb866f-kf6pc\" (UID: \"3d439193-5483-476d-9612-99dfb4558121\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kf6pc"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.319243 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hk5tl\" (UniqueName: \"kubernetes.io/projected/52481e6d-f985-4177-a719-d12248f049ac-kube-api-access-hk5tl\") pod \"multus-admission-controller-857f4d67dd-mjrk7\" (UID: \"52481e6d-f985-4177-a719-d12248f049ac\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mjrk7"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.319267 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-9kg2j\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") " pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.319357 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2cf585f0-5546-4222-903b-4466b186896f-machine-approver-tls\") pod \"machine-approver-56656f9798-dnk5c\" (UID: \"2cf585f0-5546-4222-903b-4466b186896f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dnk5c"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.319383 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b800839-991f-4aa2-be47-6cef1e5c81d0-config\") pod \"console-operator-58897d9998-rj6xk\" (UID: \"1b800839-991f-4aa2-be47-6cef1e5c81d0\") " pod="openshift-console-operator/console-operator-58897d9998-rj6xk"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.319468 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7r569\" (UniqueName: \"kubernetes.io/projected/8cf481e0-f3f1-4e65-8ef0-5ac700f2a31e-kube-api-access-7r569\") pod \"machine-config-server-p5j2t\" (UID: \"8cf481e0-f3f1-4e65-8ef0-5ac700f2a31e\") " pod="openshift-machine-config-operator/machine-config-server-p5j2t"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.319493 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cd59cacf-f27e-4731-9dca-59b01415316f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hlxkl\" (UID: \"cd59cacf-f27e-4731-9dca-59b01415316f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hlxkl"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.319615 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba583b66-6dbd-415a-b113-873eb19c4d4c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-5gw6k\" (UID: \"ba583b66-6dbd-415a-b113-873eb19c4d4c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5gw6k"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.319784 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgtpk\" (UniqueName: \"kubernetes.io/projected/37fa17e7-c0eb-4d5a-a36e-fe257b56e1cb-kube-api-access-jgtpk\") pod \"migrator-59844c95c7-xjnhh\" (UID: \"37fa17e7-c0eb-4d5a-a36e-fe257b56e1cb\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xjnhh"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.319810 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-9kg2j\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") " pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.319901 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hnqx\" (UniqueName: \"kubernetes.io/projected/6f649ea1-47f1-4ae4-8b05-a5b5e2503b06-kube-api-access-2hnqx\") pod \"authentication-operator-69f744f599-wxwm4\" (UID: \"6f649ea1-47f1-4ae4-8b05-a5b5e2503b06\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wxwm4"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.319931 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb99950d-0dff-4eba-aa36-f369fc4fdca6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-84ldr\" (UID: \"eb99950d-0dff-4eba-aa36-f369fc4fdca6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-84ldr"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.319965 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f900c05-f074-4cf2-b045-8487fb6b95fc-serving-cert\") pod \"service-ca-operator-777779d784-sjlp2\" (UID: \"1f900c05-f074-4cf2-b045-8487fb6b95fc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sjlp2"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.319988 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/69df0bca-c432-4fea-b47d-93a194a58243-metrics-tls\") pod \"dns-default-gqmhh\" (UID: \"69df0bca-c432-4fea-b47d-93a194a58243\") " pod="openshift-dns/dns-default-gqmhh"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.320059 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.320108 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ead04ba4-ffa8-4bf8-ae26-c9014dfda96f-default-certificate\") pod \"router-default-5444994796-bzr2z\" (UID: \"ead04ba4-ffa8-4bf8-ae26-c9014dfda96f\") " pod="openshift-ingress/router-default-5444994796-bzr2z"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.320131 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2cf585f0-5546-4222-903b-4466b186896f-auth-proxy-config\") pod \"machine-approver-56656f9798-dnk5c\" (UID: \"2cf585f0-5546-4222-903b-4466b186896f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dnk5c"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.320173 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f649ea1-47f1-4ae4-8b05-a5b5e2503b06-service-ca-bundle\") pod \"authentication-operator-69f744f599-wxwm4\" (UID: \"6f649ea1-47f1-4ae4-8b05-a5b5e2503b06\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wxwm4"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.320192 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b2adbffd-6f0b-4fa3-98b2-eb9ebfb307d0-profile-collector-cert\") pod \"catalog-operator-68c6474976-79w9t\" (UID: \"b2adbffd-6f0b-4fa3-98b2-eb9ebfb307d0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-79w9t"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.320267 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96f08c18-0ef6-469c-a9bb-3fa51058fb4d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lv2pm\" (UID: \"96f08c18-0ef6-469c-a9bb-3fa51058fb4d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lv2pm"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.320290 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ead04ba4-ffa8-4bf8-ae26-c9014dfda96f-metrics-certs\") pod \"router-default-5444994796-bzr2z\" (UID: \"ead04ba4-ffa8-4bf8-ae26-c9014dfda96f\") " pod="openshift-ingress/router-default-5444994796-bzr2z"
Feb 21 00:10:04 crc kubenswrapper[4906]: E0221 00:10:04.320381 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:04.820363342 +0000 UTC m=+140.071950848 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.320790 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f900c05-f074-4cf2-b045-8487fb6b95fc-config\") pod \"service-ca-operator-777779d784-sjlp2\" (UID: \"1f900c05-f074-4cf2-b045-8487fb6b95fc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sjlp2"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.320809 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c83a888c-071a-44b4-be7a-e1e2d2bb638e-config\") pod \"kube-controller-manager-operator-78b949d7b-rhvm5\" (UID: \"c83a888c-071a-44b4-be7a-e1e2d2bb638e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rhvm5"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.320828 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3d439193-5483-476d-9612-99dfb4558121-available-featuregates\") pod \"openshift-config-operator-7777fb866f-kf6pc\" (UID: \"3d439193-5483-476d-9612-99dfb4558121\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kf6pc"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.321182 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpc5q\" (UniqueName: \"kubernetes.io/projected/16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e-kube-api-access-wpc5q\") pod \"marketplace-operator-79b997595-fm5j7\" (UID: \"16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e\") " pod="openshift-marketplace/marketplace-operator-79b997595-fm5j7"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.321329 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxlqr\" (UniqueName: \"kubernetes.io/projected/ba583b66-6dbd-415a-b113-873eb19c4d4c-kube-api-access-bxlqr\") pod \"openshift-controller-manager-operator-756b6f6bc6-5gw6k\" (UID: \"ba583b66-6dbd-415a-b113-873eb19c4d4c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5gw6k"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.321353 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e8bebf2a-8dd4-4ddc-a74a-e13b789e7674-srv-cert\") pod \"olm-operator-6b444d44fb-tn7tl\" (UID: \"e8bebf2a-8dd4-4ddc-a74a-e13b789e7674\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tn7tl"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.321014 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f649ea1-47f1-4ae4-8b05-a5b5e2503b06-service-ca-bundle\") pod \"authentication-operator-69f744f599-wxwm4\" (UID: \"6f649ea1-47f1-4ae4-8b05-a5b5e2503b06\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wxwm4"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.321073 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-fm5j7\" (UID: \"16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e\") " pod="openshift-marketplace/marketplace-operator-79b997595-fm5j7"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.321100 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3d439193-5483-476d-9612-99dfb4558121-available-featuregates\") pod \"openshift-config-operator-7777fb866f-kf6pc\" (UID: \"3d439193-5483-476d-9612-99dfb4558121\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kf6pc"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.322001 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.322084 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbtd2\" (UniqueName: \"kubernetes.io/projected/eb99950d-0dff-4eba-aa36-f369fc4fdca6-kube-api-access-pbtd2\") pod \"kube-storage-version-migrator-operator-b67b599dd-84ldr\" (UID: \"eb99950d-0dff-4eba-aa36-f369fc4fdca6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-84ldr"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.322110 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e8bebf2a-8dd4-4ddc-a74a-e13b789e7674-profile-collector-cert\") pod \"olm-operator-6b444d44fb-tn7tl\" (UID: \"e8bebf2a-8dd4-4ddc-a74a-e13b789e7674\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tn7tl"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.322127 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-fm5j7\" (UID: \"16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e\") " pod="openshift-marketplace/marketplace-operator-79b997595-fm5j7"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.322144 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a86819ff-277a-4ef9-8bef-3de42f9406fc-signing-key\") pod \"service-ca-9c57cc56f-6v9gm\" (UID: \"a86819ff-277a-4ef9-8bef-3de42f9406fc\") " pod="openshift-service-ca/service-ca-9c57cc56f-6v9gm"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.322161 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f649ea1-47f1-4ae4-8b05-a5b5e2503b06-config\") pod \"authentication-operator-69f744f599-wxwm4\" (UID: \"6f649ea1-47f1-4ae4-8b05-a5b5e2503b06\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wxwm4"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.322180 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3094818f-2083-4b5f-bf1d-691731224abd-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-qxg6m\" (UID: \"3094818f-2083-4b5f-bf1d-691731224abd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qxg6m"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.322202 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qm9ss\" (UniqueName: \"kubernetes.io/projected/3094818f-2083-4b5f-bf1d-691731224abd-kube-api-access-qm9ss\") pod \"machine-config-controller-84d6567774-qxg6m\" (UID: \"3094818f-2083-4b5f-bf1d-691731224abd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qxg6m"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.322220 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a1df0ec3-aa36-4685-8c9b-586beaf71340-secret-volume\") pod \"collect-profiles-29527200-nc4ns\" (UID: \"a1df0ec3-aa36-4685-8c9b-586beaf71340\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29527200-nc4ns"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.322241 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jv8f9\" (UniqueName: \"kubernetes.io/projected/eb81731c-9eb7-4bf7-a263-a2117fabb5cc-kube-api-access-jv8f9\") pod \"image-pruner-29527200-5vtx4\" (UID: \"eb81731c-9eb7-4bf7-a263-a2117fabb5cc\") " pod="openshift-image-registry/image-pruner-29527200-5vtx4"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.322268 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c83a888c-071a-44b4-be7a-e1e2d2bb638e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-rhvm5\" (UID: \"c83a888c-071a-44b4-be7a-e1e2d2bb638e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rhvm5"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.322283 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cf585f0-5546-4222-903b-4466b186896f-config\") pod \"machine-approver-56656f9798-dnk5c\" (UID: \"2cf585f0-5546-4222-903b-4466b186896f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dnk5c"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.322310 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96f08c18-0ef6-469c-a9bb-3fa51058fb4d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lv2pm\" (UID: \"96f08c18-0ef6-469c-a9bb-3fa51058fb4d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lv2pm"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.322329 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqpxp\" (UniqueName: \"kubernetes.io/projected/314520d4-89fd-44fc-8eff-57534f58a1d5-kube-api-access-gqpxp\") pod \"packageserver-d55dfcdfc-w9crs\" (UID: \"314520d4-89fd-44fc-8eff-57534f58a1d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w9crs"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.322346 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/69df0bca-c432-4fea-b47d-93a194a58243-config-volume\") pod \"dns-default-gqmhh\" (UID: \"69df0bca-c432-4fea-b47d-93a194a58243\") " pod="openshift-dns/dns-default-gqmhh"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.322361 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glx4j\" (UniqueName: \"kubernetes.io/projected/69df0bca-c432-4fea-b47d-93a194a58243-kube-api-access-glx4j\") pod \"dns-default-gqmhh\" (UID: \"69df0bca-c432-4fea-b47d-93a194a58243\") " pod="openshift-dns/dns-default-gqmhh"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.322378 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/314520d4-89fd-44fc-8eff-57534f58a1d5-tmpfs\") pod \"packageserver-d55dfcdfc-w9crs\" (UID: \"314520d4-89fd-44fc-8eff-57534f58a1d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w9crs"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.322397 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-9kg2j\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") " pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.322414 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpcpv\" (UniqueName: \"kubernetes.io/projected/1b800839-991f-4aa2-be47-6cef1e5c81d0-kube-api-access-qpcpv\") pod \"console-operator-58897d9998-rj6xk\" (UID: \"1b800839-991f-4aa2-be47-6cef1e5c81d0\") " pod="openshift-console-operator/console-operator-58897d9998-rj6xk"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.322435 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/52481e6d-f985-4177-a719-d12248f049ac-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mjrk7\" (UID: \"52481e6d-f985-4177-a719-d12248f049ac\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mjrk7"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.322466 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/8cf481e0-f3f1-4e65-8ef0-5ac700f2a31e-certs\") pod \"machine-config-server-p5j2t\" (UID: \"8cf481e0-f3f1-4e65-8ef0-5ac700f2a31e\") " pod="openshift-machine-config-operator/machine-config-server-p5j2t"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.322484 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwlzv\" (UniqueName: \"kubernetes.io/projected/b2adbffd-6f0b-4fa3-98b2-eb9ebfb307d0-kube-api-access-zwlzv\") pod \"catalog-operator-68c6474976-79w9t\" (UID: \"b2adbffd-6f0b-4fa3-98b2-eb9ebfb307d0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-79w9t"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.322500 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f649ea1-47f1-4ae4-8b05-a5b5e2503b06-serving-cert\") pod \"authentication-operator-69f744f599-wxwm4\" (UID: \"6f649ea1-47f1-4ae4-8b05-a5b5e2503b06\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wxwm4"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.322522 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6599a587-de25-4c8d-82e6-102438cd2547-auth-proxy-config\") pod \"machine-config-operator-74547568cd-v8z45\" (UID: \"6599a587-de25-4c8d-82e6-102438cd2547\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v8z45"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.322539 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b800839-991f-4aa2-be47-6cef1e5c81d0-serving-cert\") pod \"console-operator-58897d9998-rj6xk\" (UID: \"1b800839-991f-4aa2-be47-6cef1e5c81d0\") " pod="openshift-console-operator/console-operator-58897d9998-rj6xk"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.323016 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-fm5j7\" (UID: \"16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e\") " pod="openshift-marketplace/marketplace-operator-79b997595-fm5j7"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.323136 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f649ea1-47f1-4ae4-8b05-a5b5e2503b06-config\") pod \"authentication-operator-69f744f599-wxwm4\" (UID: \"6f649ea1-47f1-4ae4-8b05-a5b5e2503b06\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wxwm4"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.324048 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6599a587-de25-4c8d-82e6-102438cd2547-auth-proxy-config\") pod \"machine-config-operator-74547568cd-v8z45\" (UID: \"6599a587-de25-4c8d-82e6-102438cd2547\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v8z45"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.324218 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3094818f-2083-4b5f-bf1d-691731224abd-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-qxg6m\" (UID: \"3094818f-2083-4b5f-bf1d-691731224abd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qxg6m"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.324342 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96f08c18-0ef6-469c-a9bb-3fa51058fb4d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lv2pm\" (UID: \"96f08c18-0ef6-469c-a9bb-3fa51058fb4d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lv2pm"
Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.325845 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e8bebf2a-8dd4-4ddc-a74a-e13b789e7674-srv-cert\") pod \"olm-operator-6b444d44fb-tn7tl\" (UID: 
\"e8bebf2a-8dd4-4ddc-a74a-e13b789e7674\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tn7tl" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.326535 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a86819ff-277a-4ef9-8bef-3de42f9406fc-signing-key\") pod \"service-ca-9c57cc56f-6v9gm\" (UID: \"a86819ff-277a-4ef9-8bef-3de42f9406fc\") " pod="openshift-service-ca/service-ca-9c57cc56f-6v9gm" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.327004 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b2adbffd-6f0b-4fa3-98b2-eb9ebfb307d0-profile-collector-cert\") pod \"catalog-operator-68c6474976-79w9t\" (UID: \"b2adbffd-6f0b-4fa3-98b2-eb9ebfb307d0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-79w9t" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.327125 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96f08c18-0ef6-469c-a9bb-3fa51058fb4d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lv2pm\" (UID: \"96f08c18-0ef6-469c-a9bb-3fa51058fb4d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lv2pm" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.327626 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a1df0ec3-aa36-4685-8c9b-586beaf71340-secret-volume\") pod \"collect-profiles-29527200-nc4ns\" (UID: \"a1df0ec3-aa36-4685-8c9b-586beaf71340\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29527200-nc4ns" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.327737 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/e8bebf2a-8dd4-4ddc-a74a-e13b789e7674-profile-collector-cert\") pod \"olm-operator-6b444d44fb-tn7tl\" (UID: \"e8bebf2a-8dd4-4ddc-a74a-e13b789e7674\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tn7tl" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.328150 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f649ea1-47f1-4ae4-8b05-a5b5e2503b06-serving-cert\") pod \"authentication-operator-69f744f599-wxwm4\" (UID: \"6f649ea1-47f1-4ae4-8b05-a5b5e2503b06\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wxwm4" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.334295 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/314520d4-89fd-44fc-8eff-57534f58a1d5-apiservice-cert\") pod \"packageserver-d55dfcdfc-w9crs\" (UID: \"314520d4-89fd-44fc-8eff-57534f58a1d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w9crs" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.334529 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/314520d4-89fd-44fc-8eff-57534f58a1d5-webhook-cert\") pod \"packageserver-d55dfcdfc-w9crs\" (UID: \"314520d4-89fd-44fc-8eff-57534f58a1d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w9crs" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.334578 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/314520d4-89fd-44fc-8eff-57534f58a1d5-tmpfs\") pod \"packageserver-d55dfcdfc-w9crs\" (UID: \"314520d4-89fd-44fc-8eff-57534f58a1d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w9crs" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.342192 4906 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-metrics-certs-default" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.353231 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ead04ba4-ffa8-4bf8-ae26-c9014dfda96f-metrics-certs\") pod \"router-default-5444994796-bzr2z\" (UID: \"ead04ba4-ffa8-4bf8-ae26-c9014dfda96f\") " pod="openshift-ingress/router-default-5444994796-bzr2z" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.362811 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.382568 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.395138 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ead04ba4-ffa8-4bf8-ae26-c9014dfda96f-default-certificate\") pod \"router-default-5444994796-bzr2z\" (UID: \"ead04ba4-ffa8-4bf8-ae26-c9014dfda96f\") " pod="openshift-ingress/router-default-5444994796-bzr2z" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.402962 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.410764 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ead04ba4-ffa8-4bf8-ae26-c9014dfda96f-stats-auth\") pod \"router-default-5444994796-bzr2z\" (UID: \"ead04ba4-ffa8-4bf8-ae26-c9014dfda96f\") " pod="openshift-ingress/router-default-5444994796-bzr2z" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.422485 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.422938 4906 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.423136 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/69df0bca-c432-4fea-b47d-93a194a58243-config-volume\") pod \"dns-default-gqmhh\" (UID: \"69df0bca-c432-4fea-b47d-93a194a58243\") " pod="openshift-dns/dns-default-gqmhh" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.423183 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glx4j\" (UniqueName: \"kubernetes.io/projected/69df0bca-c432-4fea-b47d-93a194a58243-kube-api-access-glx4j\") pod \"dns-default-gqmhh\" (UID: \"69df0bca-c432-4fea-b47d-93a194a58243\") " pod="openshift-dns/dns-default-gqmhh" Feb 21 00:10:04 crc kubenswrapper[4906]: E0221 00:10:04.423465 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:04.923428478 +0000 UTC m=+140.175015984 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.423967 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.424001 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/69df0bca-c432-4fea-b47d-93a194a58243-metrics-tls\") pod \"dns-default-gqmhh\" (UID: \"69df0bca-c432-4fea-b47d-93a194a58243\") " pod="openshift-dns/dns-default-gqmhh" Feb 21 00:10:04 crc kubenswrapper[4906]: E0221 00:10:04.424530 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:04.924499249 +0000 UTC m=+140.176086785 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.429739 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ead04ba4-ffa8-4bf8-ae26-c9014dfda96f-service-ca-bundle\") pod \"router-default-5444994796-bzr2z\" (UID: \"ead04ba4-ffa8-4bf8-ae26-c9014dfda96f\") " pod="openshift-ingress/router-default-5444994796-bzr2z" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.443002 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.462438 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.490236 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.502678 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.513840 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2cf585f0-5546-4222-903b-4466b186896f-machine-approver-tls\") pod \"machine-approver-56656f9798-dnk5c\" (UID: \"2cf585f0-5546-4222-903b-4466b186896f\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dnk5c" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.523606 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.524388 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:04 crc kubenswrapper[4906]: E0221 00:10:04.524552 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:05.024527199 +0000 UTC m=+140.276114715 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.524897 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:04 crc kubenswrapper[4906]: E0221 00:10:04.525342 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:05.025331552 +0000 UTC m=+140.276919078 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.530900 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2cf585f0-5546-4222-903b-4466b186896f-auth-proxy-config\") pod \"machine-approver-56656f9798-dnk5c\" (UID: \"2cf585f0-5546-4222-903b-4466b186896f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dnk5c" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.542441 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.543705 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cf585f0-5546-4222-903b-4466b186896f-config\") pod \"machine-approver-56656f9798-dnk5c\" (UID: \"2cf585f0-5546-4222-903b-4466b186896f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dnk5c" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.563005 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.582159 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.595634 4906 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba583b66-6dbd-415a-b113-873eb19c4d4c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-5gw6k\" (UID: \"ba583b66-6dbd-415a-b113-873eb19c4d4c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5gw6k" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.602129 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.622056 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.625616 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba583b66-6dbd-415a-b113-873eb19c4d4c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-5gw6k\" (UID: \"ba583b66-6dbd-415a-b113-873eb19c4d4c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5gw6k" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.625945 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:04 crc kubenswrapper[4906]: E0221 00:10:04.626121 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-21 00:10:05.126098023 +0000 UTC m=+140.377685559 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.630429 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:04 crc kubenswrapper[4906]: E0221 00:10:04.631329 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:05.131283152 +0000 UTC m=+140.382870698 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.642275 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.661987 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.681375 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.703021 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.731624 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:04 crc kubenswrapper[4906]: E0221 00:10:04.732393 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-21 00:10:05.232350411 +0000 UTC m=+140.483937917 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.732670 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.733399 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 21 00:10:04 crc kubenswrapper[4906]: E0221 00:10:04.733535 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:05.233528225 +0000 UTC m=+140.485115721 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.738346 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1b800839-991f-4aa2-be47-6cef1e5c81d0-trusted-ca\") pod \"console-operator-58897d9998-rj6xk\" (UID: \"1b800839-991f-4aa2-be47-6cef1e5c81d0\") " pod="openshift-console-operator/console-operator-58897d9998-rj6xk" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.742237 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.750506 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b800839-991f-4aa2-be47-6cef1e5c81d0-config\") pod \"console-operator-58897d9998-rj6xk\" (UID: \"1b800839-991f-4aa2-be47-6cef1e5c81d0\") " pod="openshift-console-operator/console-operator-58897d9998-rj6xk" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.762351 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.782442 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.787856 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1b800839-991f-4aa2-be47-6cef1e5c81d0-serving-cert\") pod \"console-operator-58897d9998-rj6xk\" (UID: \"1b800839-991f-4aa2-be47-6cef1e5c81d0\") " pod="openshift-console-operator/console-operator-58897d9998-rj6xk" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.832858 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mv5s\" (UniqueName: \"kubernetes.io/projected/693c1181-a8ca-4d12-8136-8bfad07df623-kube-api-access-5mv5s\") pod \"cluster-image-registry-operator-dc59b4c8b-pvsfj\" (UID: \"693c1181-a8ca-4d12-8136-8bfad07df623\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pvsfj" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.833824 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:04 crc kubenswrapper[4906]: E0221 00:10:04.833996 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:05.333964996 +0000 UTC m=+140.585552542 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:04 crc kubenswrapper[4906]: E0221 00:10:04.835243 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:05.335223822 +0000 UTC m=+140.586811358 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.835426 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.843008 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.851930 4906 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ls2rr\" (UniqueName: \"kubernetes.io/projected/f51d9e17-fed2-4d4a-aeab-8b135b6222fb-kube-api-access-ls2rr\") pod \"controller-manager-879f6c89f-bp67z\" (UID: \"f51d9e17-fed2-4d4a-aeab-8b135b6222fb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bp67z" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.852394 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3094818f-2083-4b5f-bf1d-691731224abd-proxy-tls\") pod \"machine-config-controller-84d6567774-qxg6m\" (UID: \"3094818f-2083-4b5f-bf1d-691731224abd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qxg6m" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.861365 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-bp67z" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.864247 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.883226 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.903120 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.922768 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.935120 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d439193-5483-476d-9612-99dfb4558121-serving-cert\") pod 
\"openshift-config-operator-7777fb866f-kf6pc\" (UID: \"3d439193-5483-476d-9612-99dfb4558121\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kf6pc" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.937628 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:04 crc kubenswrapper[4906]: E0221 00:10:04.938017 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:05.43798064 +0000 UTC m=+140.689568196 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.938599 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:04 crc kubenswrapper[4906]: E0221 00:10:04.939116 4906 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:05.439096062 +0000 UTC m=+140.690683598 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.944058 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 21 00:10:04 crc kubenswrapper[4906]: I0221 00:10:04.983004 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e4fc2e33-56c9-440b-a7cc-ea9982d47658-bound-sa-token\") pod \"ingress-operator-5b745b69d9-bk7l4\" (UID: \"e4fc2e33-56c9-440b-a7cc-ea9982d47658\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bk7l4" Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.008852 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4fkr\" (UniqueName: \"kubernetes.io/projected/e4fc2e33-56c9-440b-a7cc-ea9982d47658-kube-api-access-n4fkr\") pod \"ingress-operator-5b745b69d9-bk7l4\" (UID: \"e4fc2e33-56c9-440b-a7cc-ea9982d47658\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bk7l4" Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.050628 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.052811 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:05.552784592 +0000 UTC m=+140.804372108 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.053197 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.053465 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.054151 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bk7l4" Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.054392 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6599a587-de25-4c8d-82e6-102438cd2547-images\") pod \"machine-config-operator-74547568cd-v8z45\" (UID: \"6599a587-de25-4c8d-82e6-102438cd2547\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v8z45" Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.055667 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/693c1181-a8ca-4d12-8136-8bfad07df623-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-pvsfj\" (UID: \"693c1181-a8ca-4d12-8136-8bfad07df623\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pvsfj" Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.063837 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.072886 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6599a587-de25-4c8d-82e6-102438cd2547-proxy-tls\") pod \"machine-config-operator-74547568cd-v8z45\" (UID: \"6599a587-de25-4c8d-82e6-102438cd2547\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v8z45" Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.082786 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.120235 4906 request.go:700] Waited for 1.017987976s due to client-side throttling, not priority and fairness, request: 
GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/secrets?fieldSelector=metadata.name%3Dv4-0-config-user-template-provider-selection&limit=500&resourceVersion=0 Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.122190 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.134639 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-9kg2j\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") " pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j" Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.138732 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.141821 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.150896 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-9kg2j\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") " pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j" Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.151265 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-user-template-error\") pod 
\"oauth-openshift-558db77b4-9kg2j\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") " pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j" Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.154252 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.154731 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:05.654714607 +0000 UTC m=+140.906302123 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.162547 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.167510 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-9kg2j\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j" Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.182998 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.197513 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-bp67z"] Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.203335 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.210498 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-9kg2j\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") " pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j" Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.222373 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.229434 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/52481e6d-f985-4177-a719-d12248f049ac-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mjrk7\" (UID: \"52481e6d-f985-4177-a719-d12248f049ac\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mjrk7" Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.243730 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.249556 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-9kg2j\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") " pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j" Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.255048 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.255255 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:05.75522238 +0000 UTC m=+141.006809906 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.255595 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.256246 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:05.756014713 +0000 UTC m=+141.007602229 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.268404 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.282354 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.285895 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-bk7l4"] Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.289644 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-9kg2j\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") " pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j" Feb 21 00:10:05 crc kubenswrapper[4906]: W0221 00:10:05.290937 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4fc2e33_56c9_440b_a7cc_ea9982d47658.slice/crio-db22ef6c68f99b3bef1dd6e4bedbbee6a5d2d7879b8de98cfc39512fbfd78058 WatchSource:0}: Error finding container db22ef6c68f99b3bef1dd6e4bedbbee6a5d2d7879b8de98cfc39512fbfd78058: Status 404 returned error can't find the container with id db22ef6c68f99b3bef1dd6e4bedbbee6a5d2d7879b8de98cfc39512fbfd78058 Feb 21 00:10:05 
crc kubenswrapper[4906]: I0221 00:10:05.309435 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.314002 4906 configmap.go:193] Couldn't get configMap openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-config: failed to sync configmap cache: timed out waiting for the condition Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.314126 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cd59cacf-f27e-4731-9dca-59b01415316f-config podName:cd59cacf-f27e-4731-9dca-59b01415316f nodeName:}" failed. No retries permitted until 2026-02-21 00:10:05.814087584 +0000 UTC m=+141.065675110 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/cd59cacf-f27e-4731-9dca-59b01415316f-config") pod "openshift-kube-scheduler-operator-5fdd9b5758-hlxkl" (UID: "cd59cacf-f27e-4731-9dca-59b01415316f") : failed to sync configmap cache: timed out waiting for the condition Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.314291 4906 secret.go:188] Couldn't get secret openshift-kube-storage-version-migrator-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.314430 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb99950d-0dff-4eba-aa36-f369fc4fdca6-serving-cert podName:eb99950d-0dff-4eba-aa36-f369fc4fdca6 nodeName:}" failed. No retries permitted until 2026-02-21 00:10:05.814409123 +0000 UTC m=+141.065996629 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/eb99950d-0dff-4eba-aa36-f369fc4fdca6-serving-cert") pod "kube-storage-version-migrator-operator-b67b599dd-84ldr" (UID: "eb99950d-0dff-4eba-aa36-f369fc4fdca6") : failed to sync secret cache: timed out waiting for the condition Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.315957 4906 secret.go:188] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.315997 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cf481e0-f3f1-4e65-8ef0-5ac700f2a31e-node-bootstrap-token podName:8cf481e0-f3f1-4e65-8ef0-5ac700f2a31e nodeName:}" failed. No retries permitted until 2026-02-21 00:10:05.815986288 +0000 UTC m=+141.067573794 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/8cf481e0-f3f1-4e65-8ef0-5ac700f2a31e-node-bootstrap-token") pod "machine-config-server-p5j2t" (UID: "8cf481e0-f3f1-4e65-8ef0-5ac700f2a31e") : failed to sync secret cache: timed out waiting for the condition Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.316039 4906 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-service-ca: failed to sync configmap cache: timed out waiting for the condition Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.316070 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-service-ca podName:e0f1e160-28c6-4ff5-8b24-8c962f120747 nodeName:}" failed. No retries permitted until 2026-02-21 00:10:05.8160607 +0000 UTC m=+141.067648256 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "v4-0-config-system-service-ca" (UniqueName: "kubernetes.io/configmap/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-service-ca") pod "oauth-openshift-558db77b4-9kg2j" (UID: "e0f1e160-28c6-4ff5-8b24-8c962f120747") : failed to sync configmap cache: timed out waiting for the condition Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.316101 4906 configmap.go:193] Couldn't get configMap openshift-image-registry/serviceca: failed to sync configmap cache: timed out waiting for the condition Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.316123 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eb81731c-9eb7-4bf7-a263-a2117fabb5cc-serviceca podName:eb81731c-9eb7-4bf7-a263-a2117fabb5cc nodeName:}" failed. No retries permitted until 2026-02-21 00:10:05.816116572 +0000 UTC m=+141.067704068 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serviceca" (UniqueName: "kubernetes.io/configmap/eb81731c-9eb7-4bf7-a263-a2117fabb5cc-serviceca") pod "image-pruner-29527200-5vtx4" (UID: "eb81731c-9eb7-4bf7-a263-a2117fabb5cc") : failed to sync configmap cache: timed out waiting for the condition Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.316156 4906 configmap.go:193] Couldn't get configMap openshift-authentication/audit: failed to sync configmap cache: timed out waiting for the condition Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.316231 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e0f1e160-28c6-4ff5-8b24-8c962f120747-audit-policies podName:e0f1e160-28c6-4ff5-8b24-8c962f120747 nodeName:}" failed. No retries permitted until 2026-02-21 00:10:05.816216545 +0000 UTC m=+141.067804071 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "audit-policies" (UniqueName: "kubernetes.io/configmap/e0f1e160-28c6-4ff5-8b24-8c962f120747-audit-policies") pod "oauth-openshift-558db77b4-9kg2j" (UID: "e0f1e160-28c6-4ff5-8b24-8c962f120747") : failed to sync configmap cache: timed out waiting for the condition Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.316284 4906 secret.go:188] Couldn't get secret openshift-kube-scheduler-operator/kube-scheduler-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.316317 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd59cacf-f27e-4731-9dca-59b01415316f-serving-cert podName:cd59cacf-f27e-4731-9dca-59b01415316f nodeName:}" failed. No retries permitted until 2026-02-21 00:10:05.816307587 +0000 UTC m=+141.067895113 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/cd59cacf-f27e-4731-9dca-59b01415316f-serving-cert") pod "openshift-kube-scheduler-operator-5fdd9b5758-hlxkl" (UID: "cd59cacf-f27e-4731-9dca-59b01415316f") : failed to sync secret cache: timed out waiting for the condition Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.316369 4906 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.316436 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2adbffd-6f0b-4fa3-98b2-eb9ebfb307d0-srv-cert podName:b2adbffd-6f0b-4fa3-98b2-eb9ebfb307d0 nodeName:}" failed. No retries permitted until 2026-02-21 00:10:05.816426041 +0000 UTC m=+141.068013537 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/b2adbffd-6f0b-4fa3-98b2-eb9ebfb307d0-srv-cert") pod "catalog-operator-68c6474976-79w9t" (UID: "b2adbffd-6f0b-4fa3-98b2-eb9ebfb307d0") : failed to sync secret cache: timed out waiting for the condition Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.319490 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-9kg2j\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") " pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j" Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.319503 4906 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-cliconfig: failed to sync configmap cache: timed out waiting for the condition Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.319584 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-cliconfig podName:e0f1e160-28c6-4ff5-8b24-8c962f120747 nodeName:}" failed. No retries permitted until 2026-02-21 00:10:05.81956351 +0000 UTC m=+141.071151016 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "v4-0-config-system-cliconfig" (UniqueName: "kubernetes.io/configmap/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-cliconfig") pod "oauth-openshift-558db77b4-9kg2j" (UID: "e0f1e160-28c6-4ff5-8b24-8c962f120747") : failed to sync configmap cache: timed out waiting for the condition Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.320755 4906 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.320755 4906 secret.go:188] Couldn't get secret openshift-service-ca-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.320788 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-trusted-ca-bundle podName:e0f1e160-28c6-4ff5-8b24-8c962f120747 nodeName:}" failed. No retries permitted until 2026-02-21 00:10:05.820779555 +0000 UTC m=+141.072367061 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "v4-0-config-system-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-trusted-ca-bundle") pod "oauth-openshift-558db77b4-9kg2j" (UID: "e0f1e160-28c6-4ff5-8b24-8c962f120747") : failed to sync configmap cache: timed out waiting for the condition Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.320856 4906 configmap.go:193] Couldn't get configMap openshift-kube-storage-version-migrator-operator/config: failed to sync configmap cache: timed out waiting for the condition Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.320871 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f900c05-f074-4cf2-b045-8487fb6b95fc-serving-cert podName:1f900c05-f074-4cf2-b045-8487fb6b95fc nodeName:}" failed. No retries permitted until 2026-02-21 00:10:05.820829086 +0000 UTC m=+141.072416612 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/1f900c05-f074-4cf2-b045-8487fb6b95fc-serving-cert") pod "service-ca-operator-777779d784-sjlp2" (UID: "1f900c05-f074-4cf2-b045-8487fb6b95fc") : failed to sync secret cache: timed out waiting for the condition Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.320882 4906 configmap.go:193] Couldn't get configMap openshift-service-ca-operator/service-ca-operator-config: failed to sync configmap cache: timed out waiting for the condition Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.320895 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eb99950d-0dff-4eba-aa36-f369fc4fdca6-config podName:eb99950d-0dff-4eba-aa36-f369fc4fdca6 nodeName:}" failed. No retries permitted until 2026-02-21 00:10:05.820883758 +0000 UTC m=+141.072471314 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/eb99950d-0dff-4eba-aa36-f369fc4fdca6-config") pod "kube-storage-version-migrator-operator-b67b599dd-84ldr" (UID: "eb99950d-0dff-4eba-aa36-f369fc4fdca6") : failed to sync configmap cache: timed out waiting for the condition
Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.320914 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1f900c05-f074-4cf2-b045-8487fb6b95fc-config podName:1f900c05-f074-4cf2-b045-8487fb6b95fc nodeName:}" failed. No retries permitted until 2026-02-21 00:10:05.820903259 +0000 UTC m=+141.072490855 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/1f900c05-f074-4cf2-b045-8487fb6b95fc-config") pod "service-ca-operator-777779d784-sjlp2" (UID: "1f900c05-f074-4cf2-b045-8487fb6b95fc") : failed to sync configmap cache: timed out waiting for the condition
Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.320933 4906 configmap.go:193] Couldn't get configMap openshift-kube-controller-manager-operator/kube-controller-manager-operator-config: failed to sync configmap cache: timed out waiting for the condition
Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.320959 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c83a888c-071a-44b4-be7a-e1e2d2bb638e-config podName:c83a888c-071a-44b4-be7a-e1e2d2bb638e nodeName:}" failed. No retries permitted until 2026-02-21 00:10:05.82095105 +0000 UTC m=+141.072538656 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/c83a888c-071a-44b4-be7a-e1e2d2bb638e-config") pod "kube-controller-manager-operator-78b949d7b-rhvm5" (UID: "c83a888c-071a-44b4-be7a-e1e2d2bb638e") : failed to sync configmap cache: timed out waiting for the condition
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.321891 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.322104 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bk7l4" event={"ID":"e4fc2e33-56c9-440b-a7cc-ea9982d47658","Type":"ContainerStarted","Data":"db22ef6c68f99b3bef1dd6e4bedbbee6a5d2d7879b8de98cfc39512fbfd78058"}
Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.322997 4906 secret.go:188] Couldn't get secret openshift-kube-controller-manager-operator/kube-controller-manager-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition
Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.323028 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c83a888c-071a-44b4-be7a-e1e2d2bb638e-serving-cert podName:c83a888c-071a-44b4-be7a-e1e2d2bb638e nodeName:}" failed. No retries permitted until 2026-02-21 00:10:05.823020319 +0000 UTC m=+141.074607825 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/c83a888c-071a-44b4-be7a-e1e2d2bb638e-serving-cert") pod "kube-controller-manager-operator-78b949d7b-rhvm5" (UID: "c83a888c-071a-44b4-be7a-e1e2d2bb638e") : failed to sync secret cache: timed out waiting for the condition
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.323205 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-bp67z" event={"ID":"f51d9e17-fed2-4d4a-aeab-8b135b6222fb","Type":"ContainerStarted","Data":"adab02a6f226723c45a0e4badc16e4e4d2148e7aef61fdf16f7225de03513e8c"}
Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.323655 4906 secret.go:188] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition
Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.323693 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cf481e0-f3f1-4e65-8ef0-5ac700f2a31e-certs podName:8cf481e0-f3f1-4e65-8ef0-5ac700f2a31e nodeName:}" failed. No retries permitted until 2026-02-21 00:10:05.823672828 +0000 UTC m=+141.075260334 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/8cf481e0-f3f1-4e65-8ef0-5ac700f2a31e-certs") pod "machine-config-server-p5j2t" (UID: "8cf481e0-f3f1-4e65-8ef0-5ac700f2a31e") : failed to sync secret cache: timed out waiting for the condition
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.340533 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pvsfj"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.343715 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.357428 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.357518 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:05.857498555 +0000 UTC m=+141.109086061 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.358055 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4"
Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.358394 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:05.85837954 +0000 UTC m=+141.109967046 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.361921 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.381864 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.403784 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.422459 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.423707 4906 configmap.go:193] Couldn't get configMap openshift-dns/dns-default: failed to sync configmap cache: timed out waiting for the condition
Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.423795 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/69df0bca-c432-4fea-b47d-93a194a58243-config-volume podName:69df0bca-c432-4fea-b47d-93a194a58243 nodeName:}" failed. No retries permitted until 2026-02-21 00:10:05.92377339 +0000 UTC m=+141.175360906 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/69df0bca-c432-4fea-b47d-93a194a58243-config-volume") pod "dns-default-gqmhh" (UID: "69df0bca-c432-4fea-b47d-93a194a58243") : failed to sync configmap cache: timed out waiting for the condition
Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.424928 4906 secret.go:188] Couldn't get secret openshift-dns/dns-default-metrics-tls: failed to sync secret cache: timed out waiting for the condition
Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.425042 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69df0bca-c432-4fea-b47d-93a194a58243-metrics-tls podName:69df0bca-c432-4fea-b47d-93a194a58243 nodeName:}" failed. No retries permitted until 2026-02-21 00:10:05.925018735 +0000 UTC m=+141.176606241 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/69df0bca-c432-4fea-b47d-93a194a58243-metrics-tls") pod "dns-default-gqmhh" (UID: "69df0bca-c432-4fea-b47d-93a194a58243") : failed to sync secret cache: timed out waiting for the condition
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.441940 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.459208 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.459931 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:05.959914173 +0000 UTC m=+141.211501679 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.462278 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.481546 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.511935 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pvsfj"]
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.511991 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.521633 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.542194 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.560838 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4"
Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.561321 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:06.061307212 +0000 UTC m=+141.312894728 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.563308 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.581778 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"serviceca"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.602133 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"pruner-dockercfg-p7bcw"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.622237 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.642639 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.662174 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.662612 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.662726 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:06.16264416 +0000 UTC m=+141.414231706 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.663335 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4"
Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.664042 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:06.164020299 +0000 UTC m=+141.415607835 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.682045 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.702390 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.723138 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.742745 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.762588 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.765303 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.765590 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:06.265546752 +0000 UTC m=+141.517134328 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.766810 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4"
Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.767497 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:06.267466787 +0000 UTC m=+141.519054353 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.782144 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.802310 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.822109 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.842766 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.863216 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.868526 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.868741 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:06.368711882 +0000 UTC m=+141.620299428 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.869581 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/eb81731c-9eb7-4bf7-a263-a2117fabb5cc-serviceca\") pod \"image-pruner-29527200-5vtx4\" (UID: \"eb81731c-9eb7-4bf7-a263-a2117fabb5cc\") " pod="openshift-image-registry/image-pruner-29527200-5vtx4"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.869765 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e0f1e160-28c6-4ff5-8b24-8c962f120747-audit-policies\") pod \"oauth-openshift-558db77b4-9kg2j\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") " pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.869915 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-9kg2j\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") " pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.870031 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-9kg2j\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") " pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.870155 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb99950d-0dff-4eba-aa36-f369fc4fdca6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-84ldr\" (UID: \"eb99950d-0dff-4eba-aa36-f369fc4fdca6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-84ldr"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.870253 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.870316 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f900c05-f074-4cf2-b045-8487fb6b95fc-serving-cert\") pod \"service-ca-operator-777779d784-sjlp2\" (UID: \"1f900c05-f074-4cf2-b045-8487fb6b95fc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sjlp2"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.870399 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f900c05-f074-4cf2-b045-8487fb6b95fc-config\") pod \"service-ca-operator-777779d784-sjlp2\" (UID: \"1f900c05-f074-4cf2-b045-8487fb6b95fc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sjlp2"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.870468 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c83a888c-071a-44b4-be7a-e1e2d2bb638e-config\") pod \"kube-controller-manager-operator-78b949d7b-rhvm5\" (UID: \"c83a888c-071a-44b4-be7a-e1e2d2bb638e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rhvm5"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.870815 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c83a888c-071a-44b4-be7a-e1e2d2bb638e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-rhvm5\" (UID: \"c83a888c-071a-44b4-be7a-e1e2d2bb638e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rhvm5"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.871002 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/8cf481e0-f3f1-4e65-8ef0-5ac700f2a31e-certs\") pod \"machine-config-server-p5j2t\" (UID: \"8cf481e0-f3f1-4e65-8ef0-5ac700f2a31e\") " pod="openshift-machine-config-operator/machine-config-server-p5j2t"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.871272 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e0f1e160-28c6-4ff5-8b24-8c962f120747-audit-policies\") pod \"oauth-openshift-558db77b4-9kg2j\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") " pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.871285 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-9kg2j\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") " pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.871386 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb99950d-0dff-4eba-aa36-f369fc4fdca6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-84ldr\" (UID: \"eb99950d-0dff-4eba-aa36-f369fc4fdca6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-84ldr"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.871498 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd59cacf-f27e-4731-9dca-59b01415316f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hlxkl\" (UID: \"cd59cacf-f27e-4731-9dca-59b01415316f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hlxkl"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.871590 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd59cacf-f27e-4731-9dca-59b01415316f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hlxkl\" (UID: \"cd59cacf-f27e-4731-9dca-59b01415316f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hlxkl"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.871678 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b2adbffd-6f0b-4fa3-98b2-eb9ebfb307d0-srv-cert\") pod \"catalog-operator-68c6474976-79w9t\" (UID: \"b2adbffd-6f0b-4fa3-98b2-eb9ebfb307d0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-79w9t"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.871773 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/8cf481e0-f3f1-4e65-8ef0-5ac700f2a31e-node-bootstrap-token\") pod \"machine-config-server-p5j2t\" (UID: \"8cf481e0-f3f1-4e65-8ef0-5ac700f2a31e\") " pod="openshift-machine-config-operator/machine-config-server-p5j2t"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.871998 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-9kg2j\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") " pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.872250 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/eb81731c-9eb7-4bf7-a263-a2117fabb5cc-serviceca\") pod \"image-pruner-29527200-5vtx4\" (UID: \"eb81731c-9eb7-4bf7-a263-a2117fabb5cc\") " pod="openshift-image-registry/image-pruner-29527200-5vtx4"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.872632 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c83a888c-071a-44b4-be7a-e1e2d2bb638e-config\") pod \"kube-controller-manager-operator-78b949d7b-rhvm5\" (UID: \"c83a888c-071a-44b4-be7a-e1e2d2bb638e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rhvm5"
Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.872973 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:06.372942593 +0000 UTC m=+141.624530139 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.873242 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb99950d-0dff-4eba-aa36-f369fc4fdca6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-84ldr\" (UID: \"eb99950d-0dff-4eba-aa36-f369fc4fdca6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-84ldr"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.873731 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f900c05-f074-4cf2-b045-8487fb6b95fc-config\") pod \"service-ca-operator-777779d784-sjlp2\" (UID: \"1f900c05-f074-4cf2-b045-8487fb6b95fc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sjlp2"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.875323 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-9kg2j\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") " pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.875655 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-9kg2j\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") " pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.879379 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f900c05-f074-4cf2-b045-8487fb6b95fc-serving-cert\") pod \"service-ca-operator-777779d784-sjlp2\" (UID: \"1f900c05-f074-4cf2-b045-8487fb6b95fc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sjlp2"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.879438 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c83a888c-071a-44b4-be7a-e1e2d2bb638e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-rhvm5\" (UID: \"c83a888c-071a-44b4-be7a-e1e2d2bb638e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rhvm5"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.880329 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b2adbffd-6f0b-4fa3-98b2-eb9ebfb307d0-srv-cert\") pod \"catalog-operator-68c6474976-79w9t\" (UID: \"b2adbffd-6f0b-4fa3-98b2-eb9ebfb307d0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-79w9t"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.881901 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb99950d-0dff-4eba-aa36-f369fc4fdca6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-84ldr\" (UID: \"eb99950d-0dff-4eba-aa36-f369fc4fdca6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-84ldr"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.883797 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.898455 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd59cacf-f27e-4731-9dca-59b01415316f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hlxkl\" (UID: \"cd59cacf-f27e-4731-9dca-59b01415316f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hlxkl"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.901973 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.903650 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd59cacf-f27e-4731-9dca-59b01415316f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hlxkl\" (UID: \"cd59cacf-f27e-4731-9dca-59b01415316f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hlxkl"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.923256 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.927983 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/8cf481e0-f3f1-4e65-8ef0-5ac700f2a31e-node-bootstrap-token\") pod \"machine-config-server-p5j2t\" (UID: \"8cf481e0-f3f1-4e65-8ef0-5ac700f2a31e\") " pod="openshift-machine-config-operator/machine-config-server-p5j2t"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.942991 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.946613 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/8cf481e0-f3f1-4e65-8ef0-5ac700f2a31e-certs\") pod \"machine-config-server-p5j2t\" (UID: \"8cf481e0-f3f1-4e65-8ef0-5ac700f2a31e\") " pod="openshift-machine-config-operator/machine-config-server-p5j2t"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.962849 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.972939 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.973105 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:06.473085076 +0000 UTC m=+141.724672592 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.973360 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.973441 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/69df0bca-c432-4fea-b47d-93a194a58243-metrics-tls\") pod \"dns-default-gqmhh\" (UID: \"69df0bca-c432-4fea-b47d-93a194a58243\") " pod="openshift-dns/dns-default-gqmhh" Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.973648 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/69df0bca-c432-4fea-b47d-93a194a58243-config-volume\") pod \"dns-default-gqmhh\" (UID: \"69df0bca-c432-4fea-b47d-93a194a58243\") " pod="openshift-dns/dns-default-gqmhh" Feb 21 00:10:05 crc kubenswrapper[4906]: E0221 00:10:05.973824 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:06.473813547 +0000 UTC m=+141.725401063 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.977713 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/69df0bca-c432-4fea-b47d-93a194a58243-metrics-tls\") pod \"dns-default-gqmhh\" (UID: \"69df0bca-c432-4fea-b47d-93a194a58243\") " pod="openshift-dns/dns-default-gqmhh" Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.981811 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 21 00:10:05 crc kubenswrapper[4906]: I0221 00:10:05.985100 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/69df0bca-c432-4fea-b47d-93a194a58243-config-volume\") pod \"dns-default-gqmhh\" (UID: \"69df0bca-c432-4fea-b47d-93a194a58243\") " pod="openshift-dns/dns-default-gqmhh" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.002308 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.023075 4906 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.042978 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.062137 4906 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.078329 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:06 crc kubenswrapper[4906]: E0221 00:10:06.080891 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:06.580846877 +0000 UTC m=+141.832434463 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.084190 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.104849 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.120873 4906 request.go:700] Waited for 1.933269196s due to client-side throttling, not priority and fairness, request: 
GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.125010 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.142739 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.181999 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:06 crc kubenswrapper[4906]: E0221 00:10:06.182705 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:06.682661608 +0000 UTC m=+141.934249124 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.185310 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/edb7f37d-699d-46ba-bbe2-7dbb6b0f09f8-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-gc8hw\" (UID: \"edb7f37d-699d-46ba-bbe2-7dbb6b0f09f8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gc8hw" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.189296 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gc8hw" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.205445 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdnd8\" (UniqueName: \"kubernetes.io/projected/51c6b6f9-3b1f-4028-acc1-dac92556b401-kube-api-access-vdnd8\") pod \"apiserver-7bbb656c7d-r9sr7\" (UID: \"51c6b6f9-3b1f-4028-acc1-dac92556b401\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r9sr7" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.223735 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clft5\" (UniqueName: \"kubernetes.io/projected/beecb71f-3791-44c8-bee4-83585ee82c14-kube-api-access-clft5\") pod \"console-f9d7485db-86b7j\" (UID: \"beecb71f-3791-44c8-bee4-83585ee82c14\") " pod="openshift-console/console-f9d7485db-86b7j" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.229905 4906 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-86b7j" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.237941 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cdadab89-f0cf-4bd7-af7e-17c67a65688a-bound-sa-token\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.263971 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm2jj\" (UniqueName: \"kubernetes.io/projected/4145877d-bf92-4ebc-8552-df3e4680eaf5-kube-api-access-rm2jj\") pod \"downloads-7954f5f757-84vm6\" (UID: \"4145877d-bf92-4ebc-8552-df3e4680eaf5\") " pod="openshift-console/downloads-7954f5f757-84vm6" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.283184 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:06 crc kubenswrapper[4906]: E0221 00:10:06.283928 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:06.783906183 +0000 UTC m=+142.035493689 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.288822 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2jnz\" (UniqueName: \"kubernetes.io/projected/e0d315ba-d07b-4043-ba77-bf8859d16638-kube-api-access-k2jnz\") pod \"route-controller-manager-6576b87f9c-w4fcj\" (UID: \"e0d315ba-d07b-4043-ba77-bf8859d16638\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w4fcj" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.298200 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgtv8\" (UniqueName: \"kubernetes.io/projected/53b67334-e090-4527-831d-e36d70482003-kube-api-access-zgtv8\") pod \"etcd-operator-b45778765-nbpnr\" (UID: \"53b67334-e090-4527-831d-e36d70482003\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nbpnr" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.319551 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9q2t\" (UniqueName: \"kubernetes.io/projected/4ac75298-9ac5-4b16-8110-455c43d00945-kube-api-access-w9q2t\") pod \"dns-operator-744455d44c-fqmc8\" (UID: \"4ac75298-9ac5-4b16-8110-455c43d00945\") " pod="openshift-dns-operator/dns-operator-744455d44c-fqmc8" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.331166 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-bp67z" 
event={"ID":"f51d9e17-fed2-4d4a-aeab-8b135b6222fb","Type":"ContainerStarted","Data":"173a8bc327f1bfe41fe3d9c7e3d8dbd06f569d51a09273f20af540655056fc5f"} Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.331340 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-bp67z" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.333076 4906 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-bp67z container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.333125 4906 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-bp67z" podUID="f51d9e17-fed2-4d4a-aeab-8b135b6222fb" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.336886 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dklbx\" (UniqueName: \"kubernetes.io/projected/5bc1e348-33c5-4bfa-984f-312b58bff4cd-kube-api-access-dklbx\") pod \"machine-api-operator-5694c8668f-bbxgp\" (UID: \"5bc1e348-33c5-4bfa-984f-312b58bff4cd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bbxgp" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.337594 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pvsfj" event={"ID":"693c1181-a8ca-4d12-8136-8bfad07df623","Type":"ContainerStarted","Data":"4d97d03d3ff1d758df8c61bb37350fec7e00e830b7fcfd58325dd1cd2d02d84e"} Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.337655 4906 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pvsfj" event={"ID":"693c1181-a8ca-4d12-8136-8bfad07df623","Type":"ContainerStarted","Data":"c134448463a47532d4db6f1933bbc8da2e8b4719f857d25eab56b0dfd991390f"} Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.340788 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bk7l4" event={"ID":"e4fc2e33-56c9-440b-a7cc-ea9982d47658","Type":"ContainerStarted","Data":"b48fef1cad08c2d3cefb5d9c5da3f7df05cc9dbb9158cf3b9ac6cd0e80933603"} Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.340819 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bk7l4" event={"ID":"e4fc2e33-56c9-440b-a7cc-ea9982d47658","Type":"ContainerStarted","Data":"7437f995befd4fbd1dff1e02868ac1e544a9344f1c2a28ff7fe49169be6decfc"} Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.380360 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hzf7\" (UniqueName: \"kubernetes.io/projected/6485254f-aeb0-4df4-ba6e-c9eaa0718933-kube-api-access-2hzf7\") pod \"apiserver-76f77b778f-6578b\" (UID: \"6485254f-aeb0-4df4-ba6e-c9eaa0718933\") " pod="openshift-apiserver/apiserver-76f77b778f-6578b" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.384797 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:06 crc kubenswrapper[4906]: E0221 00:10:06.385160 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-02-21 00:10:06.885147188 +0000 UTC m=+142.136734694 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.400266 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhq5z\" (UniqueName: \"kubernetes.io/projected/6ff05c62-73da-4640-8c3f-8b846c14296c-kube-api-access-fhq5z\") pod \"cluster-samples-operator-665b6dd947-vmpkj\" (UID: \"6ff05c62-73da-4640-8c3f-8b846c14296c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vmpkj" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.403518 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gc8hw"] Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.421620 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2rpk\" (UniqueName: \"kubernetes.io/projected/cdadab89-f0cf-4bd7-af7e-17c67a65688a-kube-api-access-m2rpk\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.435903 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chh8p\" (UniqueName: \"kubernetes.io/projected/ee0bafe0-cacc-4996-9675-a87c77f3984b-kube-api-access-chh8p\") pod \"package-server-manager-789f6589d5-667sz\" (UID: 
\"ee0bafe0-cacc-4996-9675-a87c77f3984b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-667sz" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.456713 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w4fcj" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.457498 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-bbxgp" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.485280 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r9sr7" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.492902 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:06 crc kubenswrapper[4906]: E0221 00:10:06.493595 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:06.993577148 +0000 UTC m=+142.245164654 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.493948 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c83a888c-071a-44b4-be7a-e1e2d2bb638e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-rhvm5\" (UID: \"c83a888c-071a-44b4-be7a-e1e2d2bb638e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rhvm5" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.504939 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-86b7j"] Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.505215 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-nbpnr" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.527157 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chrtq\" (UniqueName: \"kubernetes.io/projected/1f900c05-f074-4cf2-b045-8487fb6b95fc-kube-api-access-chrtq\") pod \"service-ca-operator-777779d784-sjlp2\" (UID: \"1f900c05-f074-4cf2-b045-8487fb6b95fc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sjlp2" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.537744 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztmvl\" (UniqueName: \"kubernetes.io/projected/b6ee1e80-1e94-4513-903d-0478a92070b4-kube-api-access-ztmvl\") pod \"control-plane-machine-set-operator-78cbb6b69f-fdrrq\" (UID: \"b6ee1e80-1e94-4513-903d-0478a92070b4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fdrrq" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.554886 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7kzn\" (UniqueName: \"kubernetes.io/projected/96f08c18-0ef6-469c-a9bb-3fa51058fb4d-kube-api-access-t7kzn\") pod \"openshift-apiserver-operator-796bbdcf4f-lv2pm\" (UID: \"96f08c18-0ef6-469c-a9bb-3fa51058fb4d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lv2pm" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.555146 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-84vm6" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.564256 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmmwh\" (UniqueName: \"kubernetes.io/projected/2cf585f0-5546-4222-903b-4466b186896f-kube-api-access-wmmwh\") pod \"machine-approver-56656f9798-dnk5c\" (UID: \"2cf585f0-5546-4222-903b-4466b186896f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dnk5c" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.572204 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-667sz" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.575291 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-fqmc8" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.588355 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t6tz\" (UniqueName: \"kubernetes.io/projected/3d439193-5483-476d-9612-99dfb4558121-kube-api-access-5t6tz\") pod \"openshift-config-operator-7777fb866f-kf6pc\" (UID: \"3d439193-5483-476d-9612-99dfb4558121\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kf6pc" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.595573 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:06 crc kubenswrapper[4906]: E0221 00:10:06.595890 4906 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:07.095878493 +0000 UTC m=+142.347465999 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.608710 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s95bh\" (UniqueName: \"kubernetes.io/projected/6599a587-de25-4c8d-82e6-102438cd2547-kube-api-access-s95bh\") pod \"machine-config-operator-74547568cd-v8z45\" (UID: \"6599a587-de25-4c8d-82e6-102438cd2547\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v8z45" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.623223 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t46q\" (UniqueName: \"kubernetes.io/projected/a1df0ec3-aa36-4685-8c9b-586beaf71340-kube-api-access-5t46q\") pod \"collect-profiles-29527200-nc4ns\" (UID: \"a1df0ec3-aa36-4685-8c9b-586beaf71340\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29527200-nc4ns" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.624576 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29527200-nc4ns" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.630051 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vmpkj" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.639984 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fdrrq" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.652141 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lv2pm" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.662936 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dnk5c" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.670679 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpwzr\" (UniqueName: \"kubernetes.io/projected/ead04ba4-ffa8-4bf8-ae26-c9014dfda96f-kube-api-access-dpwzr\") pod \"router-default-5444994796-bzr2z\" (UID: \"ead04ba4-ffa8-4bf8-ae26-c9014dfda96f\") " pod="openshift-ingress/router-default-5444994796-bzr2z" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.681489 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-6578b" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.687418 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kf6pc" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.691209 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfkqm\" (UniqueName: \"kubernetes.io/projected/e0f1e160-28c6-4ff5-8b24-8c962f120747-kube-api-access-dfkqm\") pod \"oauth-openshift-558db77b4-9kg2j\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") " pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.696643 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:06 crc kubenswrapper[4906]: E0221 00:10:06.697142 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:07.197126058 +0000 UTC m=+142.448713564 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.699941 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v8z45" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.700948 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h97rj\" (UniqueName: \"kubernetes.io/projected/a86819ff-277a-4ef9-8bef-3de42f9406fc-kube-api-access-h97rj\") pod \"service-ca-9c57cc56f-6v9gm\" (UID: \"a86819ff-277a-4ef9-8bef-3de42f9406fc\") " pod="openshift-service-ca/service-ca-9c57cc56f-6v9gm" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.701644 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.716774 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7mlm\" (UniqueName: \"kubernetes.io/projected/e8bebf2a-8dd4-4ddc-a74a-e13b789e7674-kube-api-access-b7mlm\") pod \"olm-operator-6b444d44fb-tn7tl\" (UID: \"e8bebf2a-8dd4-4ddc-a74a-e13b789e7674\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tn7tl" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.718013 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk5tl\" (UniqueName: \"kubernetes.io/projected/52481e6d-f985-4177-a719-d12248f049ac-kube-api-access-hk5tl\") pod \"multus-admission-controller-857f4d67dd-mjrk7\" (UID: \"52481e6d-f985-4177-a719-d12248f049ac\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mjrk7" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.736251 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-sjlp2" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.746914 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rhvm5" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.756088 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r569\" (UniqueName: \"kubernetes.io/projected/8cf481e0-f3f1-4e65-8ef0-5ac700f2a31e-kube-api-access-7r569\") pod \"machine-config-server-p5j2t\" (UID: \"8cf481e0-f3f1-4e65-8ef0-5ac700f2a31e\") " pod="openshift-machine-config-operator/machine-config-server-p5j2t" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.759269 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cd59cacf-f27e-4731-9dca-59b01415316f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hlxkl\" (UID: \"cd59cacf-f27e-4731-9dca-59b01415316f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hlxkl" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.771331 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-p5j2t" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.789062 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgtpk\" (UniqueName: \"kubernetes.io/projected/37fa17e7-c0eb-4d5a-a36e-fe257b56e1cb-kube-api-access-jgtpk\") pod \"migrator-59844c95c7-xjnhh\" (UID: \"37fa17e7-c0eb-4d5a-a36e-fe257b56e1cb\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xjnhh" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.798390 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:06 crc kubenswrapper[4906]: E0221 00:10:06.798726 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:07.298712303 +0000 UTC m=+142.550299809 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.821287 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hnqx\" (UniqueName: \"kubernetes.io/projected/6f649ea1-47f1-4ae4-8b05-a5b5e2503b06-kube-api-access-2hnqx\") pod \"authentication-operator-69f744f599-wxwm4\" (UID: \"6f649ea1-47f1-4ae4-8b05-a5b5e2503b06\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wxwm4" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.822057 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpc5q\" (UniqueName: \"kubernetes.io/projected/16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e-kube-api-access-wpc5q\") pod \"marketplace-operator-79b997595-fm5j7\" (UID: \"16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e\") " pod="openshift-marketplace/marketplace-operator-79b997595-fm5j7" Feb 21 00:10:06 crc kubenswrapper[4906]: W0221 00:10:06.841917 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cf481e0_f3f1_4e65_8ef0_5ac700f2a31e.slice/crio-c82870b6f8c6acb83ef8e46243175631bcafa618c7ec52e53285e73c6a965cca WatchSource:0}: Error finding container c82870b6f8c6acb83ef8e46243175631bcafa618c7ec52e53285e73c6a965cca: Status 404 returned error can't find the container with id c82870b6f8c6acb83ef8e46243175631bcafa618c7ec52e53285e73c6a965cca Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.842921 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-bxlqr\" (UniqueName: \"kubernetes.io/projected/ba583b66-6dbd-415a-b113-873eb19c4d4c-kube-api-access-bxlqr\") pod \"openshift-controller-manager-operator-756b6f6bc6-5gw6k\" (UID: \"ba583b66-6dbd-415a-b113-873eb19c4d4c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5gw6k" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.860922 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbtd2\" (UniqueName: \"kubernetes.io/projected/eb99950d-0dff-4eba-aa36-f369fc4fdca6-kube-api-access-pbtd2\") pod \"kube-storage-version-migrator-operator-b67b599dd-84ldr\" (UID: \"eb99950d-0dff-4eba-aa36-f369fc4fdca6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-84ldr" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.882447 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tn7tl" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.895766 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-6v9gm" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.896756 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jv8f9\" (UniqueName: \"kubernetes.io/projected/eb81731c-9eb7-4bf7-a263-a2117fabb5cc-kube-api-access-jv8f9\") pod \"image-pruner-29527200-5vtx4\" (UID: \"eb81731c-9eb7-4bf7-a263-a2117fabb5cc\") " pod="openshift-image-registry/image-pruner-29527200-5vtx4" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.904588 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qm9ss\" (UniqueName: \"kubernetes.io/projected/3094818f-2083-4b5f-bf1d-691731224abd-kube-api-access-qm9ss\") pod \"machine-config-controller-84d6567774-qxg6m\" (UID: \"3094818f-2083-4b5f-bf1d-691731224abd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qxg6m" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.907087 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:06 crc kubenswrapper[4906]: E0221 00:10:06.907619 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:07.407598936 +0000 UTC m=+142.659186442 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.924059 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpcpv\" (UniqueName: \"kubernetes.io/projected/1b800839-991f-4aa2-be47-6cef1e5c81d0-kube-api-access-qpcpv\") pod \"console-operator-58897d9998-rj6xk\" (UID: \"1b800839-991f-4aa2-be47-6cef1e5c81d0\") " pod="openshift-console-operator/console-operator-58897d9998-rj6xk" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.932927 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-fm5j7" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.944372 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-wxwm4" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.947197 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-bbxgp"] Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.956694 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-bzr2z" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.968851 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5gw6k" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.969389 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqpxp\" (UniqueName: \"kubernetes.io/projected/314520d4-89fd-44fc-8eff-57534f58a1d5-kube-api-access-gqpxp\") pod \"packageserver-d55dfcdfc-w9crs\" (UID: \"314520d4-89fd-44fc-8eff-57534f58a1d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w9crs" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.976950 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-rj6xk" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.979014 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwlzv\" (UniqueName: \"kubernetes.io/projected/b2adbffd-6f0b-4fa3-98b2-eb9ebfb307d0-kube-api-access-zwlzv\") pod \"catalog-operator-68c6474976-79w9t\" (UID: \"b2adbffd-6f0b-4fa3-98b2-eb9ebfb307d0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-79w9t" Feb 21 00:10:06 crc kubenswrapper[4906]: I0221 00:10:06.981043 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qxg6m" Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.020627 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-mjrk7" Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.023293 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-84ldr" Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.023833 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xjnhh" Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.026765 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:07 crc kubenswrapper[4906]: E0221 00:10:07.027801 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:07.527769942 +0000 UTC m=+142.779357448 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.033877 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29527200-5vtx4" Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.049987 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-79w9t" Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.056352 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hlxkl" Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.066632 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-w4fcj"] Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.069725 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glx4j\" (UniqueName: \"kubernetes.io/projected/69df0bca-c432-4fea-b47d-93a194a58243-kube-api-access-glx4j\") pod \"dns-default-gqmhh\" (UID: \"69df0bca-c432-4fea-b47d-93a194a58243\") " pod="openshift-dns/dns-default-gqmhh" Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.087417 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-r9sr7"] Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.129985 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.130134 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/43852740-4aee-4dcb-b90f-13e39c49ecc8-socket-dir\") pod \"csi-hostpathplugin-x64jb\" (UID: \"43852740-4aee-4dcb-b90f-13e39c49ecc8\") " pod="hostpath-provisioner/csi-hostpathplugin-x64jb" Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.130164 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6s5g\" (UniqueName: \"kubernetes.io/projected/43852740-4aee-4dcb-b90f-13e39c49ecc8-kube-api-access-f6s5g\") pod \"csi-hostpathplugin-x64jb\" 
(UID: \"43852740-4aee-4dcb-b90f-13e39c49ecc8\") " pod="hostpath-provisioner/csi-hostpathplugin-x64jb" Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.130256 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/27421dfc-6497-4ba3-8887-f5dc90ce57d9-cert\") pod \"ingress-canary-8l88x\" (UID: \"27421dfc-6497-4ba3-8887-f5dc90ce57d9\") " pod="openshift-ingress-canary/ingress-canary-8l88x" Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.130316 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctz45\" (UniqueName: \"kubernetes.io/projected/27421dfc-6497-4ba3-8887-f5dc90ce57d9-kube-api-access-ctz45\") pod \"ingress-canary-8l88x\" (UID: \"27421dfc-6497-4ba3-8887-f5dc90ce57d9\") " pod="openshift-ingress-canary/ingress-canary-8l88x" Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.130445 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/43852740-4aee-4dcb-b90f-13e39c49ecc8-csi-data-dir\") pod \"csi-hostpathplugin-x64jb\" (UID: \"43852740-4aee-4dcb-b90f-13e39c49ecc8\") " pod="hostpath-provisioner/csi-hostpathplugin-x64jb" Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.130487 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/43852740-4aee-4dcb-b90f-13e39c49ecc8-registration-dir\") pod \"csi-hostpathplugin-x64jb\" (UID: \"43852740-4aee-4dcb-b90f-13e39c49ecc8\") " pod="hostpath-provisioner/csi-hostpathplugin-x64jb" Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.130582 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: 
\"kubernetes.io/host-path/43852740-4aee-4dcb-b90f-13e39c49ecc8-mountpoint-dir\") pod \"csi-hostpathplugin-x64jb\" (UID: \"43852740-4aee-4dcb-b90f-13e39c49ecc8\") " pod="hostpath-provisioner/csi-hostpathplugin-x64jb" Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.130661 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/43852740-4aee-4dcb-b90f-13e39c49ecc8-plugins-dir\") pod \"csi-hostpathplugin-x64jb\" (UID: \"43852740-4aee-4dcb-b90f-13e39c49ecc8\") " pod="hostpath-provisioner/csi-hostpathplugin-x64jb" Feb 21 00:10:07 crc kubenswrapper[4906]: E0221 00:10:07.131591 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:07.63156909 +0000 UTC m=+142.883156596 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.189556 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w9crs" Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.232438 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/43852740-4aee-4dcb-b90f-13e39c49ecc8-plugins-dir\") pod \"csi-hostpathplugin-x64jb\" (UID: \"43852740-4aee-4dcb-b90f-13e39c49ecc8\") " pod="hostpath-provisioner/csi-hostpathplugin-x64jb" Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.232480 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/43852740-4aee-4dcb-b90f-13e39c49ecc8-socket-dir\") pod \"csi-hostpathplugin-x64jb\" (UID: \"43852740-4aee-4dcb-b90f-13e39c49ecc8\") " pod="hostpath-provisioner/csi-hostpathplugin-x64jb" Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.232502 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6s5g\" (UniqueName: \"kubernetes.io/projected/43852740-4aee-4dcb-b90f-13e39c49ecc8-kube-api-access-f6s5g\") pod \"csi-hostpathplugin-x64jb\" (UID: \"43852740-4aee-4dcb-b90f-13e39c49ecc8\") " pod="hostpath-provisioner/csi-hostpathplugin-x64jb" Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.232532 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/27421dfc-6497-4ba3-8887-f5dc90ce57d9-cert\") pod \"ingress-canary-8l88x\" (UID: \"27421dfc-6497-4ba3-8887-f5dc90ce57d9\") " pod="openshift-ingress-canary/ingress-canary-8l88x" Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.232561 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: 
\"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.232589 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctz45\" (UniqueName: \"kubernetes.io/projected/27421dfc-6497-4ba3-8887-f5dc90ce57d9-kube-api-access-ctz45\") pod \"ingress-canary-8l88x\" (UID: \"27421dfc-6497-4ba3-8887-f5dc90ce57d9\") " pod="openshift-ingress-canary/ingress-canary-8l88x" Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.232629 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/43852740-4aee-4dcb-b90f-13e39c49ecc8-csi-data-dir\") pod \"csi-hostpathplugin-x64jb\" (UID: \"43852740-4aee-4dcb-b90f-13e39c49ecc8\") " pod="hostpath-provisioner/csi-hostpathplugin-x64jb" Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.232648 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/43852740-4aee-4dcb-b90f-13e39c49ecc8-registration-dir\") pod \"csi-hostpathplugin-x64jb\" (UID: \"43852740-4aee-4dcb-b90f-13e39c49ecc8\") " pod="hostpath-provisioner/csi-hostpathplugin-x64jb" Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.232678 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/43852740-4aee-4dcb-b90f-13e39c49ecc8-mountpoint-dir\") pod \"csi-hostpathplugin-x64jb\" (UID: \"43852740-4aee-4dcb-b90f-13e39c49ecc8\") " pod="hostpath-provisioner/csi-hostpathplugin-x64jb" Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.232782 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/43852740-4aee-4dcb-b90f-13e39c49ecc8-mountpoint-dir\") pod \"csi-hostpathplugin-x64jb\" (UID: 
\"43852740-4aee-4dcb-b90f-13e39c49ecc8\") " pod="hostpath-provisioner/csi-hostpathplugin-x64jb" Feb 21 00:10:07 crc kubenswrapper[4906]: E0221 00:10:07.233054 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:07.733041751 +0000 UTC m=+142.984629257 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.233314 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/43852740-4aee-4dcb-b90f-13e39c49ecc8-csi-data-dir\") pod \"csi-hostpathplugin-x64jb\" (UID: \"43852740-4aee-4dcb-b90f-13e39c49ecc8\") " pod="hostpath-provisioner/csi-hostpathplugin-x64jb" Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.233515 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/43852740-4aee-4dcb-b90f-13e39c49ecc8-registration-dir\") pod \"csi-hostpathplugin-x64jb\" (UID: \"43852740-4aee-4dcb-b90f-13e39c49ecc8\") " pod="hostpath-provisioner/csi-hostpathplugin-x64jb" Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.233813 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/43852740-4aee-4dcb-b90f-13e39c49ecc8-socket-dir\") pod \"csi-hostpathplugin-x64jb\" (UID: 
\"43852740-4aee-4dcb-b90f-13e39c49ecc8\") " pod="hostpath-provisioner/csi-hostpathplugin-x64jb" Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.233864 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/43852740-4aee-4dcb-b90f-13e39c49ecc8-plugins-dir\") pod \"csi-hostpathplugin-x64jb\" (UID: \"43852740-4aee-4dcb-b90f-13e39c49ecc8\") " pod="hostpath-provisioner/csi-hostpathplugin-x64jb" Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.237059 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/27421dfc-6497-4ba3-8887-f5dc90ce57d9-cert\") pod \"ingress-canary-8l88x\" (UID: \"27421dfc-6497-4ba3-8887-f5dc90ce57d9\") " pod="openshift-ingress-canary/ingress-canary-8l88x" Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.280144 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6s5g\" (UniqueName: \"kubernetes.io/projected/43852740-4aee-4dcb-b90f-13e39c49ecc8-kube-api-access-f6s5g\") pod \"csi-hostpathplugin-x64jb\" (UID: \"43852740-4aee-4dcb-b90f-13e39c49ecc8\") " pod="hostpath-provisioner/csi-hostpathplugin-x64jb" Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.282829 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctz45\" (UniqueName: \"kubernetes.io/projected/27421dfc-6497-4ba3-8887-f5dc90ce57d9-kube-api-access-ctz45\") pod \"ingress-canary-8l88x\" (UID: \"27421dfc-6497-4ba3-8887-f5dc90ce57d9\") " pod="openshift-ingress-canary/ingress-canary-8l88x" Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.333563 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:07 crc kubenswrapper[4906]: E0221 00:10:07.333834 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:07.833798172 +0000 UTC m=+143.085385718 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.350664 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-bzr2z" event={"ID":"ead04ba4-ffa8-4bf8-ae26-c9014dfda96f","Type":"ContainerStarted","Data":"761b46688519bdc8eb95518b8a67c586f14590eb21ec3e064c17f5fe34fa74d6"} Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.355235 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-p5j2t" event={"ID":"8cf481e0-f3f1-4e65-8ef0-5ac700f2a31e","Type":"ContainerStarted","Data":"0f3419353213d6b8808149a1543b27145c6eaedd1f52e63c36f9dba83b91658f"} Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.355304 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-p5j2t" event={"ID":"8cf481e0-f3f1-4e65-8ef0-5ac700f2a31e","Type":"ContainerStarted","Data":"c82870b6f8c6acb83ef8e46243175631bcafa618c7ec52e53285e73c6a965cca"} Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.356020 4906 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-nbpnr"] Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.361001 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dnk5c" event={"ID":"2cf585f0-5546-4222-903b-4466b186896f","Type":"ContainerStarted","Data":"dc03f23a840ad0c46672d3ba2e13a4f1d75f7082168e7c51674b9d5d4b609198"} Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.362668 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-bbxgp" event={"ID":"5bc1e348-33c5-4bfa-984f-312b58bff4cd","Type":"ContainerStarted","Data":"fc927e44c082d1f1f2561755937f7306636b882d24d5a9ec69bf45894645bbed"} Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.364999 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-86b7j" event={"ID":"beecb71f-3791-44c8-bee4-83585ee82c14","Type":"ContainerStarted","Data":"63e6a6ebf690daed534a5271ee9d9b168fcd3857c1b457994662e585749817fe"} Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.365025 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-86b7j" event={"ID":"beecb71f-3791-44c8-bee4-83585ee82c14","Type":"ContainerStarted","Data":"f62be9148b2484e5c99d7f25acb7f13624fbb2f467c1416ce512727af4ab2cae"} Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.367737 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gc8hw" event={"ID":"edb7f37d-699d-46ba-bbe2-7dbb6b0f09f8","Type":"ContainerStarted","Data":"0a617d142e914167abcf852a95569f1a2207485f13f545c7114fd1192203d8b7"} Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.367755 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gc8hw" 
event={"ID":"edb7f37d-699d-46ba-bbe2-7dbb6b0f09f8","Type":"ContainerStarted","Data":"8a92c480961bfbb532cad662c713017d92343e24d52c12e7cdf92d31bc0dbc38"} Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.368005 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-gqmhh" Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.387674 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-x64jb" Feb 21 00:10:07 crc kubenswrapper[4906]: W0221 00:10:07.390778 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0d315ba_d07b_4043_ba77_bf8859d16638.slice/crio-bf6e909d5f812860333cffe3579cd718aa224edd508444e4957465eb22e814c3 WatchSource:0}: Error finding container bf6e909d5f812860333cffe3579cd718aa224edd508444e4957465eb22e814c3: Status 404 returned error can't find the container with id bf6e909d5f812860333cffe3579cd718aa224edd508444e4957465eb22e814c3 Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.396605 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8l88x" Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.434880 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:07 crc kubenswrapper[4906]: E0221 00:10:07.437058 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-21 00:10:07.937037194 +0000 UTC m=+143.188624700 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.497218 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-bp67z" Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.537111 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:07 crc kubenswrapper[4906]: E0221 00:10:07.539775 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:08.039747971 +0000 UTC m=+143.291335507 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.589985 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-86b7j" podStartSLOduration=122.589961876 podStartE2EDuration="2m2.589961876s" podCreationTimestamp="2026-02-21 00:08:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:10:07.587923028 +0000 UTC m=+142.839510534" watchObservedRunningTime="2026-02-21 00:10:07.589961876 +0000 UTC m=+142.841549402" Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.643496 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:07 crc kubenswrapper[4906]: E0221 00:10:07.644230 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:08.144212568 +0000 UTC m=+143.395800074 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.745597 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:07 crc kubenswrapper[4906]: E0221 00:10:07.745945 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:08.245928186 +0000 UTC m=+143.497515692 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.791112 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gc8hw" podStartSLOduration=121.791092277 podStartE2EDuration="2m1.791092277s" podCreationTimestamp="2026-02-21 00:08:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:10:07.790633304 +0000 UTC m=+143.042220810" watchObservedRunningTime="2026-02-21 00:10:07.791092277 +0000 UTC m=+143.042679773" Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.846795 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:07 crc kubenswrapper[4906]: E0221 00:10:07.847159 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:08.34714153 +0000 UTC m=+143.598729036 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:07 crc kubenswrapper[4906]: I0221 00:10:07.949811 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:07 crc kubenswrapper[4906]: E0221 00:10:07.950413 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:08.450392212 +0000 UTC m=+143.701979718 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:08 crc kubenswrapper[4906]: I0221 00:10:08.031675 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pvsfj" podStartSLOduration=122.031657536 podStartE2EDuration="2m2.031657536s" podCreationTimestamp="2026-02-21 00:08:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:10:07.999073454 +0000 UTC m=+143.250660960" watchObservedRunningTime="2026-02-21 00:10:08.031657536 +0000 UTC m=+143.283245032" Feb 21 00:10:08 crc kubenswrapper[4906]: I0221 00:10:08.052646 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:08 crc kubenswrapper[4906]: E0221 00:10:08.053038 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:08.553020106 +0000 UTC m=+143.804607622 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:08 crc kubenswrapper[4906]: I0221 00:10:08.153273 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:08 crc kubenswrapper[4906]: E0221 00:10:08.154655 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:08.654618211 +0000 UTC m=+143.906205737 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:08 crc kubenswrapper[4906]: I0221 00:10:08.194505 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-p5j2t" podStartSLOduration=5.194474131 podStartE2EDuration="5.194474131s" podCreationTimestamp="2026-02-21 00:10:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:10:08.193374609 +0000 UTC m=+143.444962115" watchObservedRunningTime="2026-02-21 00:10:08.194474131 +0000 UTC m=+143.446061637" Feb 21 00:10:08 crc kubenswrapper[4906]: I0221 00:10:08.259091 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:08 crc kubenswrapper[4906]: E0221 00:10:08.259458 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:08.759444069 +0000 UTC m=+144.011031565 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:08 crc kubenswrapper[4906]: I0221 00:10:08.291586 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-bp67z" podStartSLOduration=122.291560857 podStartE2EDuration="2m2.291560857s" podCreationTimestamp="2026-02-21 00:08:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:10:08.291472914 +0000 UTC m=+143.543060420" watchObservedRunningTime="2026-02-21 00:10:08.291560857 +0000 UTC m=+143.543148373" Feb 21 00:10:08 crc kubenswrapper[4906]: I0221 00:10:08.366012 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:08 crc kubenswrapper[4906]: E0221 00:10:08.366769 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:08.866749126 +0000 UTC m=+144.118336632 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:08 crc kubenswrapper[4906]: I0221 00:10:08.405026 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-bzr2z" event={"ID":"ead04ba4-ffa8-4bf8-ae26-c9014dfda96f","Type":"ContainerStarted","Data":"1b01d0ee08546ee7bff2a94517f2ca2ca01cb33d8b62a2b1c6806b28e3302a7a"} Feb 21 00:10:08 crc kubenswrapper[4906]: I0221 00:10:08.410172 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w4fcj" event={"ID":"e0d315ba-d07b-4043-ba77-bf8859d16638","Type":"ContainerStarted","Data":"c1b63c6130c7bc60070183886568804c802944d364933b663c1fc03d4a3e9e35"} Feb 21 00:10:08 crc kubenswrapper[4906]: I0221 00:10:08.410222 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w4fcj" event={"ID":"e0d315ba-d07b-4043-ba77-bf8859d16638","Type":"ContainerStarted","Data":"bf6e909d5f812860333cffe3579cd718aa224edd508444e4957465eb22e814c3"} Feb 21 00:10:08 crc kubenswrapper[4906]: I0221 00:10:08.410732 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w4fcj" Feb 21 00:10:08 crc kubenswrapper[4906]: I0221 00:10:08.411354 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-nbpnr" 
event={"ID":"53b67334-e090-4527-831d-e36d70482003","Type":"ContainerStarted","Data":"c6ddc13c928adb29f2388c84789f24be4925381d2839bfc07a121c64baa68737"} Feb 21 00:10:08 crc kubenswrapper[4906]: I0221 00:10:08.411880 4906 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-w4fcj container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Feb 21 00:10:08 crc kubenswrapper[4906]: I0221 00:10:08.411917 4906 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w4fcj" podUID="e0d315ba-d07b-4043-ba77-bf8859d16638" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Feb 21 00:10:08 crc kubenswrapper[4906]: I0221 00:10:08.417081 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r9sr7" event={"ID":"51c6b6f9-3b1f-4028-acc1-dac92556b401","Type":"ContainerStarted","Data":"65604a879ded1c1177968e72b84209bdbff4363649d6525ecc2013ba3a4befe3"} Feb 21 00:10:08 crc kubenswrapper[4906]: I0221 00:10:08.421601 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dnk5c" event={"ID":"2cf585f0-5546-4222-903b-4466b186896f","Type":"ContainerStarted","Data":"54e67c980e3c5c1e517032cc5b65ee54aedfbcf2da94728751b983833e16a35d"} Feb 21 00:10:08 crc kubenswrapper[4906]: I0221 00:10:08.439271 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-bbxgp" event={"ID":"5bc1e348-33c5-4bfa-984f-312b58bff4cd","Type":"ContainerStarted","Data":"cc75b68224ef6e00ec2fc3f4ef3e0f56d4419f952519689a06b7261f7eddef08"} Feb 21 00:10:08 crc kubenswrapper[4906]: 
I0221 00:10:08.472167 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:08 crc kubenswrapper[4906]: E0221 00:10:08.479119 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:08.979102218 +0000 UTC m=+144.230689724 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:08 crc kubenswrapper[4906]: I0221 00:10:08.556038 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bk7l4" podStartSLOduration=122.556021007 podStartE2EDuration="2m2.556021007s" podCreationTimestamp="2026-02-21 00:08:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:10:08.555604125 +0000 UTC m=+143.807191631" watchObservedRunningTime="2026-02-21 00:10:08.556021007 +0000 UTC m=+143.807608503" Feb 21 00:10:08 crc kubenswrapper[4906]: I0221 00:10:08.575322 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:08 crc kubenswrapper[4906]: E0221 00:10:08.575499 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:09.075445053 +0000 UTC m=+144.327032559 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:08 crc kubenswrapper[4906]: I0221 00:10:08.575823 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:08 crc kubenswrapper[4906]: E0221 00:10:08.587462 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:09.087435856 +0000 UTC m=+144.339023362 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:08 crc kubenswrapper[4906]: I0221 00:10:08.678443 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:08 crc kubenswrapper[4906]: E0221 00:10:08.678799 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:09.178773767 +0000 UTC m=+144.430361273 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:08 crc kubenswrapper[4906]: I0221 00:10:08.679407 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:08 crc kubenswrapper[4906]: E0221 00:10:08.679778 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:09.179769516 +0000 UTC m=+144.431357022 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:08 crc kubenswrapper[4906]: I0221 00:10:08.744399 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9kg2j"] Feb 21 00:10:08 crc kubenswrapper[4906]: I0221 00:10:08.766038 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-84vm6"] Feb 21 00:10:08 crc kubenswrapper[4906]: I0221 00:10:08.771500 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-fqmc8"] Feb 21 00:10:08 crc kubenswrapper[4906]: W0221 00:10:08.772376 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ac75298_9ac5_4b16_8110_455c43d00945.slice/crio-769d413c68b94d66d8dc1fa2103747626c9167592186a41d0318a2ac4eb74d87 WatchSource:0}: Error finding container 769d413c68b94d66d8dc1fa2103747626c9167592186a41d0318a2ac4eb74d87: Status 404 returned error can't find the container with id 769d413c68b94d66d8dc1fa2103747626c9167592186a41d0318a2ac4eb74d87 Feb 21 00:10:08 crc kubenswrapper[4906]: I0221 00:10:08.780205 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:08 crc kubenswrapper[4906]: E0221 00:10:08.780674 4906 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:09.28064199 +0000 UTC m=+144.532229506 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:08 crc kubenswrapper[4906]: I0221 00:10:08.780794 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-667sz"] Feb 21 00:10:08 crc kubenswrapper[4906]: I0221 00:10:08.847307 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-sjlp2"] Feb 21 00:10:08 crc kubenswrapper[4906]: I0221 00:10:08.881127 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:08 crc kubenswrapper[4906]: E0221 00:10:08.881441 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:09.381425911 +0000 UTC m=+144.633013417 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:08 crc kubenswrapper[4906]: I0221 00:10:08.897537 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-bzr2z" podStartSLOduration=122.897514812 podStartE2EDuration="2m2.897514812s" podCreationTimestamp="2026-02-21 00:08:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:10:08.890372097 +0000 UTC m=+144.141959603" watchObservedRunningTime="2026-02-21 00:10:08.897514812 +0000 UTC m=+144.149102318" Feb 21 00:10:08 crc kubenswrapper[4906]: I0221 00:10:08.959780 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-bzr2z" Feb 21 00:10:08 crc kubenswrapper[4906]: I0221 00:10:08.976084 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fdrrq"] Feb 21 00:10:08 crc kubenswrapper[4906]: I0221 00:10:08.990859 4906 patch_prober.go:28] interesting pod/router-default-5444994796-bzr2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 21 00:10:08 crc kubenswrapper[4906]: [-]has-synced failed: reason withheld Feb 21 00:10:08 crc kubenswrapper[4906]: [+]process-running ok Feb 21 00:10:08 crc kubenswrapper[4906]: healthz check failed Feb 21 00:10:08 crc kubenswrapper[4906]: I0221 
00:10:08.990915 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bzr2z" podUID="ead04ba4-ffa8-4bf8-ae26-c9014dfda96f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 21 00:10:08 crc kubenswrapper[4906]: I0221 00:10:08.990942 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:08 crc kubenswrapper[4906]: E0221 00:10:08.991360 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:09.491340554 +0000 UTC m=+144.742928060 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.052393 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29527200-nc4ns"] Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.094528 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vmpkj"] Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.103598 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:09 crc kubenswrapper[4906]: E0221 00:10:09.105751 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:09.605734835 +0000 UTC m=+144.857322341 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.126065 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lv2pm"] Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.173158 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-6578b"] Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.184544 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w4fcj" podStartSLOduration=123.184520008 podStartE2EDuration="2m3.184520008s" podCreationTimestamp="2026-02-21 00:08:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:10:09.160665026 +0000 UTC m=+144.412252532" watchObservedRunningTime="2026-02-21 00:10:09.184520008 +0000 UTC m=+144.436107514" Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.206655 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:09 crc kubenswrapper[4906]: E0221 00:10:09.208578 4906 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:09.708559585 +0000 UTC m=+144.960147091 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.235783 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tn7tl"] Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.277707 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-bbxgp" podStartSLOduration=123.277664561 podStartE2EDuration="2m3.277664561s" podCreationTimestamp="2026-02-21 00:08:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:10:09.267587163 +0000 UTC m=+144.519174669" watchObservedRunningTime="2026-02-21 00:10:09.277664561 +0000 UTC m=+144.529252067" Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.284221 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-kf6pc"] Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.312605 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:09 crc kubenswrapper[4906]: E0221 00:10:09.313147 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:09.813130205 +0000 UTC m=+145.064717711 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.313293 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rhvm5"] Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.370559 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-v8z45"] Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.413447 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:09 crc kubenswrapper[4906]: E0221 00:10:09.414571 4906 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:09.914536534 +0000 UTC m=+145.166124040 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.421492 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6v9gm"] Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.423026 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:09 crc kubenswrapper[4906]: E0221 00:10:09.423572 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:09.923553942 +0000 UTC m=+145.175141628 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.450912 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5gw6k"] Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.473766 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vmpkj" event={"ID":"6ff05c62-73da-4640-8c3f-8b846c14296c","Type":"ContainerStarted","Data":"17ee3388cb414fbd5a97e6f52023c40d51fc82d60f46e6be4b6aa0303168638d"} Feb 21 00:10:09 crc kubenswrapper[4906]: W0221 00:10:09.475712 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda86819ff_277a_4ef9_8bef_3de42f9406fc.slice/crio-a041d6a1f7bf89a1ee1ac3fe66e538fe26f95a08d0db5968f406c81c8c09e870 WatchSource:0}: Error finding container a041d6a1f7bf89a1ee1ac3fe66e538fe26f95a08d0db5968f406c81c8c09e870: Status 404 returned error can't find the container with id a041d6a1f7bf89a1ee1ac3fe66e538fe26f95a08d0db5968f406c81c8c09e870 Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.487335 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-fqmc8" event={"ID":"4ac75298-9ac5-4b16-8110-455c43d00945","Type":"ContainerStarted","Data":"9b026782f1f3ce2c940c7d2731113deb22cf3aa1db2ae5ae6ae730a77b181d3a"} Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.487388 4906 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-fqmc8" event={"ID":"4ac75298-9ac5-4b16-8110-455c43d00945","Type":"ContainerStarted","Data":"769d413c68b94d66d8dc1fa2103747626c9167592186a41d0318a2ac4eb74d87"} Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.492776 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hlxkl"] Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.492858 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-wxwm4"] Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.492873 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8l88x"] Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.497485 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-84ldr"] Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.501539 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fm5j7"] Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.501606 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-rj6xk"] Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.514384 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29527200-5vtx4"] Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.534056 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:09 crc 
kubenswrapper[4906]: E0221 00:10:09.534509 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:10.034492264 +0000 UTC m=+145.286079780 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.534712 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rhvm5" event={"ID":"c83a888c-071a-44b4-be7a-e1e2d2bb638e","Type":"ContainerStarted","Data":"0947a467038464326bbff75052b9b4e609f6d2a3e35ca8892e4caac8a6a02850"} Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.534763 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-79w9t"] Feb 21 00:10:09 crc kubenswrapper[4906]: W0221 00:10:09.536794 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b800839_991f_4aa2_be47_6cef1e5c81d0.slice/crio-b89160c9b02b6d2b5907d9ec992582b62c026b7bd13b1427f0e07e6fd9acf6d5 WatchSource:0}: Error finding container b89160c9b02b6d2b5907d9ec992582b62c026b7bd13b1427f0e07e6fd9acf6d5: Status 404 returned error can't find the container with id b89160c9b02b6d2b5907d9ec992582b62c026b7bd13b1427f0e07e6fd9acf6d5 Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.537653 4906 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-667sz" event={"ID":"ee0bafe0-cacc-4996-9675-a87c77f3984b","Type":"ContainerStarted","Data":"aeda7fad2ced6094c4da5e4255b967b2bf9eeeafb652c7c357ffaf57aad489de"} Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.537765 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-667sz" event={"ID":"ee0bafe0-cacc-4996-9675-a87c77f3984b","Type":"ContainerStarted","Data":"95897b2286fec28ede3762d396cd40b7c83dbb920e918ecead176e239c64dcb8"} Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.539920 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-nbpnr" event={"ID":"53b67334-e090-4527-831d-e36d70482003","Type":"ContainerStarted","Data":"e8513e23fec88f0f76509408efee8e48952993aacea3abd7ca5178d570f2262b"} Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.553567 4906 generic.go:334] "Generic (PLEG): container finished" podID="51c6b6f9-3b1f-4028-acc1-dac92556b401" containerID="fd315e852d91c0859907fdcd81d79e7215a4cb1849b7120b0bbe0daa2ad70069" exitCode=0 Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.554576 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r9sr7" event={"ID":"51c6b6f9-3b1f-4028-acc1-dac92556b401","Type":"ContainerDied","Data":"fd315e852d91c0859907fdcd81d79e7215a4cb1849b7120b0bbe0daa2ad70069"} Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.562374 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-xjnhh"] Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.563250 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-qxg6m"] Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 
00:10:09.566797 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-x64jb"] Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.566857 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mjrk7"] Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.577194 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w9crs"] Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.586307 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fdrrq" event={"ID":"b6ee1e80-1e94-4513-903d-0478a92070b4","Type":"ContainerStarted","Data":"25353bcc18138abe11d0fbae73e553306b9c4b942eb44e71d1142e61b55912a1"} Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.615295 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dnk5c" event={"ID":"2cf585f0-5546-4222-903b-4466b186896f","Type":"ContainerStarted","Data":"b924c78755042d8f4208c7b5ea26bb844d12e37c23ca55ae5be0018713466d6a"} Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.618740 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fdrrq" podStartSLOduration=123.618721033 podStartE2EDuration="2m3.618721033s" podCreationTimestamp="2026-02-21 00:08:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:10:09.617849728 +0000 UTC m=+144.869437234" watchObservedRunningTime="2026-02-21 00:10:09.618721033 +0000 UTC m=+144.870308539" Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.620395 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-etcd-operator/etcd-operator-b45778765-nbpnr" podStartSLOduration=123.62038363 podStartE2EDuration="2m3.62038363s" podCreationTimestamp="2026-02-21 00:08:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:10:09.595037465 +0000 UTC m=+144.846624971" watchObservedRunningTime="2026-02-21 00:10:09.62038363 +0000 UTC m=+144.871971136" Feb 21 00:10:09 crc kubenswrapper[4906]: W0221 00:10:09.625409 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16b5f9cd_19b1_42ca_b24e_cab3ae7a0b6e.slice/crio-7eb5ae6babfcab802c3e1c8f31b07512f65c6eaa1d3d67ec71baf3e3d6e451e8 WatchSource:0}: Error finding container 7eb5ae6babfcab802c3e1c8f31b07512f65c6eaa1d3d67ec71baf3e3d6e451e8: Status 404 returned error can't find the container with id 7eb5ae6babfcab802c3e1c8f31b07512f65c6eaa1d3d67ec71baf3e3d6e451e8 Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.627474 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-bbxgp" event={"ID":"5bc1e348-33c5-4bfa-984f-312b58bff4cd","Type":"ContainerStarted","Data":"3651d11e9ed7a252db06d7c837352edded634ef12d754b77fc4baf706fec5d2a"} Feb 21 00:10:09 crc kubenswrapper[4906]: W0221 00:10:09.628072 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb81731c_9eb7_4bf7_a263_a2117fabb5cc.slice/crio-1806e6738408a8f1264606ff0f51915018a4f262d32476698e91ee71b4f4d35d WatchSource:0}: Error finding container 1806e6738408a8f1264606ff0f51915018a4f262d32476698e91ee71b4f4d35d: Status 404 returned error can't find the container with id 1806e6738408a8f1264606ff0f51915018a4f262d32476698e91ee71b4f4d35d Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.636098 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:09 crc kubenswrapper[4906]: E0221 00:10:09.637634 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:10.137610033 +0000 UTC m=+145.389197539 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.651742 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tn7tl" event={"ID":"e8bebf2a-8dd4-4ddc-a74a-e13b789e7674","Type":"ContainerStarted","Data":"6890a6cb975e86cfab83e1edd2708ae8c3b567941a52725f224f4b63015ad586"} Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.653645 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tn7tl" Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.654090 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dnk5c" podStartSLOduration=124.654069903 podStartE2EDuration="2m4.654069903s" podCreationTimestamp="2026-02-21 
00:08:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:10:09.652643762 +0000 UTC m=+144.904231278" watchObservedRunningTime="2026-02-21 00:10:09.654069903 +0000 UTC m=+144.905657409" Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.656818 4906 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-tn7tl container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.656867 4906 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tn7tl" podUID="e8bebf2a-8dd4-4ddc-a74a-e13b789e7674" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.672627 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-gqmhh"] Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.687972 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-84vm6" event={"ID":"4145877d-bf92-4ebc-8552-df3e4680eaf5","Type":"ContainerStarted","Data":"8afad4ae3a2c94e908445316fab2ec97364d397aedf5500b92775e9888bbf59a"} Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.688037 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-84vm6" event={"ID":"4145877d-bf92-4ebc-8552-df3e4680eaf5","Type":"ContainerStarted","Data":"4c18116725c65336737ca6ba31c896932f4b8d1ada73254ad273f6329fd7d7fa"} Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.689348 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-console/downloads-7954f5f757-84vm6" Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.691263 4906 patch_prober.go:28] interesting pod/downloads-7954f5f757-84vm6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.692473 4906 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-84vm6" podUID="4145877d-bf92-4ebc-8552-df3e4680eaf5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.715419 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j" event={"ID":"e0f1e160-28c6-4ff5-8b24-8c962f120747","Type":"ContainerStarted","Data":"10d2d9960efef7d1474dce7618c81a89160135371788af09ee874e16d051f65e"} Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.715709 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tn7tl" podStartSLOduration=123.715642864 podStartE2EDuration="2m3.715642864s" podCreationTimestamp="2026-02-21 00:08:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:10:09.681002953 +0000 UTC m=+144.932590459" watchObservedRunningTime="2026-02-21 00:10:09.715642864 +0000 UTC m=+144.967230370" Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.716708 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j" Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.718259 4906 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-console/downloads-7954f5f757-84vm6" podStartSLOduration=124.718244028 podStartE2EDuration="2m4.718244028s" podCreationTimestamp="2026-02-21 00:08:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:10:09.715596532 +0000 UTC m=+144.967184038" watchObservedRunningTime="2026-02-21 00:10:09.718244028 +0000 UTC m=+144.969831534" Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.718539 4906 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-9kg2j container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.41:6443/healthz\": dial tcp 10.217.0.41:6443: connect: connection refused" start-of-body= Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.718579 4906 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j" podUID="e0f1e160-28c6-4ff5-8b24-8c962f120747" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.41:6443/healthz\": dial tcp 10.217.0.41:6443: connect: connection refused" Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.740374 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.741051 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j" podStartSLOduration=123.74103149 podStartE2EDuration="2m3.74103149s" podCreationTimestamp="2026-02-21 00:08:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:10:09.739520896 +0000 UTC m=+144.991108402" watchObservedRunningTime="2026-02-21 00:10:09.74103149 +0000 UTC m=+144.992618996" Feb 21 00:10:09 crc kubenswrapper[4906]: E0221 00:10:09.745722 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:10.245664372 +0000 UTC m=+145.497251898 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.765559 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-sjlp2" event={"ID":"1f900c05-f074-4cf2-b045-8487fb6b95fc","Type":"ContainerStarted","Data":"2eadc7a736d304e195ed8b6502c53cfeb194d3cd41cd0107ddf6b92b24ea172e"} Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.765598 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-sjlp2" event={"ID":"1f900c05-f074-4cf2-b045-8487fb6b95fc","Type":"ContainerStarted","Data":"9942469603d996f46342a32231370e21dd33bc8024fe47403691221f35e25306"} Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.768290 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lv2pm" 
event={"ID":"96f08c18-0ef6-469c-a9bb-3fa51058fb4d","Type":"ContainerStarted","Data":"e87a04a93315e8172882404d1eb2d9428c8eb283ece07697513838019db4923a"} Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.780148 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29527200-nc4ns" event={"ID":"a1df0ec3-aa36-4685-8c9b-586beaf71340","Type":"ContainerStarted","Data":"1ecbec0a38931ace3380df0462e163f887586218ae597dd9318f161254b7ea8f"} Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.809014 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-sjlp2" podStartSLOduration=123.808989913 podStartE2EDuration="2m3.808989913s" podCreationTimestamp="2026-02-21 00:08:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:10:09.79699968 +0000 UTC m=+145.048587186" watchObservedRunningTime="2026-02-21 00:10:09.808989913 +0000 UTC m=+145.060577419" Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.813792 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kf6pc" event={"ID":"3d439193-5483-476d-9612-99dfb4558121","Type":"ContainerStarted","Data":"85295c04bc27f84cea9b4a5dbd66a609c5a37812cc6f43eb6c2af48ebcbd445e"} Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.815849 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-6578b" event={"ID":"6485254f-aeb0-4df4-ba6e-c9eaa0718933","Type":"ContainerStarted","Data":"af2bdad5605ee3c4a88071b01941f26863b56d1d67f197037ba06b168348dd61"} Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.821422 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lv2pm" 
podStartSLOduration=123.821401878 podStartE2EDuration="2m3.821401878s" podCreationTimestamp="2026-02-21 00:08:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:10:09.820389339 +0000 UTC m=+145.071976845" watchObservedRunningTime="2026-02-21 00:10:09.821401878 +0000 UTC m=+145.072989384" Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.825905 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v8z45" event={"ID":"6599a587-de25-4c8d-82e6-102438cd2547","Type":"ContainerStarted","Data":"5164fec40af62dc5f075ecc8a59d69a2479661be7eeb5d684b419eb0a2c1d0af"} Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.844218 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29527200-nc4ns" podStartSLOduration=124.844199129 podStartE2EDuration="2m4.844199129s" podCreationTimestamp="2026-02-21 00:08:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:10:09.841881033 +0000 UTC m=+145.093468539" watchObservedRunningTime="2026-02-21 00:10:09.844199129 +0000 UTC m=+145.095786635" Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.846523 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:09 crc kubenswrapper[4906]: E0221 00:10:09.848845 4906 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:10.348827252 +0000 UTC m=+145.600414758 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.947614 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:09 crc kubenswrapper[4906]: E0221 00:10:09.948604 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:10.448588354 +0000 UTC m=+145.700175860 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.949473 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:09 crc kubenswrapper[4906]: E0221 00:10:09.951465 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:10.451453006 +0000 UTC m=+145.703040512 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.961104 4906 patch_prober.go:28] interesting pod/router-default-5444994796-bzr2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 21 00:10:09 crc kubenswrapper[4906]: [-]has-synced failed: reason withheld Feb 21 00:10:09 crc kubenswrapper[4906]: [+]process-running ok Feb 21 00:10:09 crc kubenswrapper[4906]: healthz check failed Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.961153 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bzr2z" podUID="ead04ba4-ffa8-4bf8-ae26-c9014dfda96f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 21 00:10:09 crc kubenswrapper[4906]: I0221 00:10:09.973323 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w4fcj" Feb 21 00:10:10 crc kubenswrapper[4906]: I0221 00:10:10.051052 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:10 crc kubenswrapper[4906]: E0221 00:10:10.052049 4906 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:10.55161117 +0000 UTC m=+145.803198676 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:10 crc kubenswrapper[4906]: I0221 00:10:10.122782 4906 csr.go:261] certificate signing request csr-lf7mz is approved, waiting to be issued Feb 21 00:10:10 crc kubenswrapper[4906]: I0221 00:10:10.129836 4906 csr.go:257] certificate signing request csr-lf7mz is issued Feb 21 00:10:10 crc kubenswrapper[4906]: I0221 00:10:10.153352 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:10 crc kubenswrapper[4906]: E0221 00:10:10.153656 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:10.653641887 +0000 UTC m=+145.905229393 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:10 crc kubenswrapper[4906]: I0221 00:10:10.254984 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:10 crc kubenswrapper[4906]: E0221 00:10:10.255237 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:10.755207751 +0000 UTC m=+146.006795257 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:10 crc kubenswrapper[4906]: I0221 00:10:10.255621 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:10 crc kubenswrapper[4906]: E0221 00:10:10.256097 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:10.756073766 +0000 UTC m=+146.007661332 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:10 crc kubenswrapper[4906]: I0221 00:10:10.356261 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:10 crc kubenswrapper[4906]: E0221 00:10:10.356659 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:10.856644941 +0000 UTC m=+146.108232447 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:10 crc kubenswrapper[4906]: I0221 00:10:10.460699 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:10 crc kubenswrapper[4906]: E0221 00:10:10.461136 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:10.961120869 +0000 UTC m=+146.212708375 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:10 crc kubenswrapper[4906]: I0221 00:10:10.562194 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:10 crc kubenswrapper[4906]: E0221 00:10:10.563110 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:11.063091114 +0000 UTC m=+146.314678620 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:10 crc kubenswrapper[4906]: I0221 00:10:10.664162 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:10 crc kubenswrapper[4906]: E0221 00:10:10.664591 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:11.164574876 +0000 UTC m=+146.416162372 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:10 crc kubenswrapper[4906]: I0221 00:10:10.764715 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:10 crc kubenswrapper[4906]: E0221 00:10:10.764914 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:11.264888034 +0000 UTC m=+146.516475540 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:10 crc kubenswrapper[4906]: I0221 00:10:10.765144 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:10 crc kubenswrapper[4906]: E0221 00:10:10.765478 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:11.265462841 +0000 UTC m=+146.517050347 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:10 crc kubenswrapper[4906]: I0221 00:10:10.858800 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-84ldr" event={"ID":"eb99950d-0dff-4eba-aa36-f369fc4fdca6","Type":"ContainerStarted","Data":"37ce6c58debff480b85674cf3ea7718a94d4018bf1aa8d8531845f0073d483ee"} Feb 21 00:10:10 crc kubenswrapper[4906]: I0221 00:10:10.858847 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-84ldr" event={"ID":"eb99950d-0dff-4eba-aa36-f369fc4fdca6","Type":"ContainerStarted","Data":"4f768bf560075c870e0b6352f13cc2dd13b69f664bd7e75238651a95332016bc"} Feb 21 00:10:10 crc kubenswrapper[4906]: I0221 00:10:10.867304 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:10 crc kubenswrapper[4906]: E0221 00:10:10.867531 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-21 00:10:11.367513048 +0000 UTC m=+146.619100554 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:10 crc kubenswrapper[4906]: I0221 00:10:10.868754 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:10 crc kubenswrapper[4906]: E0221 00:10:10.869585 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:11.369572607 +0000 UTC m=+146.621160113 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:10 crc kubenswrapper[4906]: I0221 00:10:10.869749 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tn7tl" event={"ID":"e8bebf2a-8dd4-4ddc-a74a-e13b789e7674","Type":"ContainerStarted","Data":"a5d22ed1b348fa030d57f20dd0bdbec48111a7d7e2dd0126f4a85c0ecdaefd61"} Feb 21 00:10:10 crc kubenswrapper[4906]: I0221 00:10:10.871871 4906 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-tn7tl container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Feb 21 00:10:10 crc kubenswrapper[4906]: I0221 00:10:10.872328 4906 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tn7tl" podUID="e8bebf2a-8dd4-4ddc-a74a-e13b789e7674" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" Feb 21 00:10:10 crc kubenswrapper[4906]: I0221 00:10:10.872927 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rhvm5" event={"ID":"c83a888c-071a-44b4-be7a-e1e2d2bb638e","Type":"ContainerStarted","Data":"bef4ae46e9780774eb1413b28df206b4c8162e3d80c6462de78ead9702e11b81"} Feb 21 00:10:10 crc kubenswrapper[4906]: I0221 00:10:10.876600 4906 
generic.go:334] "Generic (PLEG): container finished" podID="6485254f-aeb0-4df4-ba6e-c9eaa0718933" containerID="72ac89e7381c24a329783d72a7fdf13136071e80e6ebf539497266dcb96da186" exitCode=0 Feb 21 00:10:10 crc kubenswrapper[4906]: I0221 00:10:10.876759 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-6578b" event={"ID":"6485254f-aeb0-4df4-ba6e-c9eaa0718933","Type":"ContainerDied","Data":"72ac89e7381c24a329783d72a7fdf13136071e80e6ebf539497266dcb96da186"} Feb 21 00:10:10 crc kubenswrapper[4906]: I0221 00:10:10.881488 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-84ldr" podStartSLOduration=124.881472868 podStartE2EDuration="2m4.881472868s" podCreationTimestamp="2026-02-21 00:08:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:10:10.880664595 +0000 UTC m=+146.132252101" watchObservedRunningTime="2026-02-21 00:10:10.881472868 +0000 UTC m=+146.133060374" Feb 21 00:10:10 crc kubenswrapper[4906]: I0221 00:10:10.888825 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lv2pm" event={"ID":"96f08c18-0ef6-469c-a9bb-3fa51058fb4d","Type":"ContainerStarted","Data":"102bfb4f6cb3108e7b4ce93f7fc974e05fc89749e435719904dfd03b299fe54d"} Feb 21 00:10:10 crc kubenswrapper[4906]: I0221 00:10:10.906206 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-wxwm4" event={"ID":"6f649ea1-47f1-4ae4-8b05-a5b5e2503b06","Type":"ContainerStarted","Data":"679ef367960307eab5bee36fb02f51671045c10eb80bc559c70ad31e3a920a08"} Feb 21 00:10:10 crc kubenswrapper[4906]: I0221 00:10:10.906249 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication-operator/authentication-operator-69f744f599-wxwm4" event={"ID":"6f649ea1-47f1-4ae4-8b05-a5b5e2503b06","Type":"ContainerStarted","Data":"74ff84925c11f79c5a896432d87bd28a61db8852113bf90c2755a0fa72e4f0fb"} Feb 21 00:10:10 crc kubenswrapper[4906]: I0221 00:10:10.929635 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-fqmc8" event={"ID":"4ac75298-9ac5-4b16-8110-455c43d00945","Type":"ContainerStarted","Data":"f2826cce6790aebb13130054a5d74a02f46768ff17f5d817536a067002c00196"} Feb 21 00:10:10 crc kubenswrapper[4906]: I0221 00:10:10.931669 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-79w9t" event={"ID":"b2adbffd-6f0b-4fa3-98b2-eb9ebfb307d0","Type":"ContainerStarted","Data":"591825a1b5607ef41e2053dfde1126e5bca02acf100fb7ad0fb1bf8aaf2e0149"} Feb 21 00:10:10 crc kubenswrapper[4906]: I0221 00:10:10.931709 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-79w9t" event={"ID":"b2adbffd-6f0b-4fa3-98b2-eb9ebfb307d0","Type":"ContainerStarted","Data":"ff7e7c291f4ab68a84a0de2f78b1340fc6ba9220a9e92a267c6aecfd44dc1f95"} Feb 21 00:10:10 crc kubenswrapper[4906]: I0221 00:10:10.932364 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-79w9t" Feb 21 00:10:10 crc kubenswrapper[4906]: I0221 00:10:10.934639 4906 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-79w9t container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" start-of-body= Feb 21 00:10:10 crc kubenswrapper[4906]: I0221 00:10:10.934785 4906 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-79w9t" podUID="b2adbffd-6f0b-4fa3-98b2-eb9ebfb307d0" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused"
Feb 21 00:10:10 crc kubenswrapper[4906]: I0221 00:10:10.949569 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mjrk7" event={"ID":"52481e6d-f985-4177-a719-d12248f049ac","Type":"ContainerStarted","Data":"094263ca9eff27a4337ee7ee192bdf752d53d9ea4418bd468b8af808e11028b3"}
Feb 21 00:10:10 crc kubenswrapper[4906]: I0221 00:10:10.949609 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mjrk7" event={"ID":"52481e6d-f985-4177-a719-d12248f049ac","Type":"ContainerStarted","Data":"c01b473b6d333170551695dd527b29bf84422f2d1fa9eaf532b6f2f89454437f"}
Feb 21 00:10:10 crc kubenswrapper[4906]: I0221 00:10:10.956978 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w9crs" event={"ID":"314520d4-89fd-44fc-8eff-57534f58a1d5","Type":"ContainerStarted","Data":"3e5c0edc6ce4cba26a984d69c440af4f7852dad9bb6d8c0e0d9316fdb4730e13"}
Feb 21 00:10:10 crc kubenswrapper[4906]: I0221 00:10:10.957036 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w9crs" event={"ID":"314520d4-89fd-44fc-8eff-57534f58a1d5","Type":"ContainerStarted","Data":"d7d6e63d2b082452d0598a67dfffeaee60ecab1533546da820f78ef554bdd102"}
Feb 21 00:10:10 crc kubenswrapper[4906]: I0221 00:10:10.957838 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w9crs"
Feb 21 00:10:10 crc kubenswrapper[4906]: I0221 00:10:10.964942 4906 patch_prober.go:28] interesting pod/router-default-5444994796-bzr2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 21 00:10:10 crc kubenswrapper[4906]: [-]has-synced failed: reason withheld
Feb 21 00:10:10 crc kubenswrapper[4906]: [+]process-running ok
Feb 21 00:10:10 crc kubenswrapper[4906]: healthz check failed
Feb 21 00:10:10 crc kubenswrapper[4906]: I0221 00:10:10.965003 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bzr2z" podUID="ead04ba4-ffa8-4bf8-ae26-c9014dfda96f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 21 00:10:10 crc kubenswrapper[4906]: I0221 00:10:10.967274 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rhvm5" podStartSLOduration=124.96726237 podStartE2EDuration="2m4.96726237s" podCreationTimestamp="2026-02-21 00:08:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:10:10.927947656 +0000 UTC m=+146.179535162" watchObservedRunningTime="2026-02-21 00:10:10.96726237 +0000 UTC m=+146.218849876"
Feb 21 00:10:10 crc kubenswrapper[4906]: I0221 00:10:10.967959 4906 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-w9crs container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.15:5443/healthz\": dial tcp 10.217.0.15:5443: connect: connection refused" start-of-body=
Feb 21 00:10:10 crc kubenswrapper[4906]: I0221 00:10:10.967981 4906 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w9crs" podUID="314520d4-89fd-44fc-8eff-57534f58a1d5" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.15:5443/healthz\": dial tcp 10.217.0.15:5443: connect: connection refused"
Feb 21 00:10:10 crc kubenswrapper[4906]: I0221 00:10:10.973966 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 21 00:10:10 crc kubenswrapper[4906]: E0221 00:10:10.974875 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:11.474836917 +0000 UTC m=+146.726424433 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 21 00:10:10 crc kubenswrapper[4906]: I0221 00:10:10.976109 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4"
Feb 21 00:10:10 crc kubenswrapper[4906]: E0221 00:10:10.977250 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:11.477240816 +0000 UTC m=+146.728828312 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.007903 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v8z45" event={"ID":"6599a587-de25-4c8d-82e6-102438cd2547","Type":"ContainerStarted","Data":"4942dea3dd5aa7944bac701f512075dfbe422cf64f43bcfea7952d29344e0353"}
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.007950 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v8z45" event={"ID":"6599a587-de25-4c8d-82e6-102438cd2547","Type":"ContainerStarted","Data":"0e940f1086365c0a3a87abe737717a81ef5c3a58f05b383983e1114f4a9636e8"}
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.025352 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29527200-5vtx4" event={"ID":"eb81731c-9eb7-4bf7-a263-a2117fabb5cc","Type":"ContainerStarted","Data":"86b63b2adfd59d33ed759387a0da097447cf23b057f3e85341c69aaeb752bf71"}
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.025425 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29527200-5vtx4" event={"ID":"eb81731c-9eb7-4bf7-a263-a2117fabb5cc","Type":"ContainerStarted","Data":"1806e6738408a8f1264606ff0f51915018a4f262d32476698e91ee71b4f4d35d"}
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.030142 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-wxwm4" podStartSLOduration=125.030127138 podStartE2EDuration="2m5.030127138s" podCreationTimestamp="2026-02-21 00:08:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:10:10.993915453 +0000 UTC m=+146.245502959" watchObservedRunningTime="2026-02-21 00:10:11.030127138 +0000 UTC m=+146.281714644"
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.031424 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-79w9t" podStartSLOduration=125.031418235 podStartE2EDuration="2m5.031418235s" podCreationTimestamp="2026-02-21 00:08:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:10:11.029572172 +0000 UTC m=+146.281159678" watchObservedRunningTime="2026-02-21 00:10:11.031418235 +0000 UTC m=+146.283005741"
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.032726 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gqmhh" event={"ID":"69df0bca-c432-4fea-b47d-93a194a58243","Type":"ContainerStarted","Data":"9fd21ff92b2a3a7722cee6873b75a6bf404525c6109f868244170c02a5bbc978"}
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.032782 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gqmhh" event={"ID":"69df0bca-c432-4fea-b47d-93a194a58243","Type":"ContainerStarted","Data":"a324e0ccf39247e03dac37d9eac0ff8e99c7b8884d67da863471bbc98d5013eb"}
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.032887 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-gqmhh"
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.034228 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fdrrq" event={"ID":"b6ee1e80-1e94-4513-903d-0478a92070b4","Type":"ContainerStarted","Data":"a14f8e08fc896da8dddda2979011b6ddae6a5bc92283711edb353c8681f8edd2"}
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.043280 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-fm5j7" event={"ID":"16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e","Type":"ContainerStarted","Data":"ee3d8b4f03b7ffb13a7aec2aaab5cbcc97bfa18e06cbac3bab17d0f29f7eafc0"}
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.043497 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-fm5j7" event={"ID":"16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e","Type":"ContainerStarted","Data":"7eb5ae6babfcab802c3e1c8f31b07512f65c6eaa1d3d67ec71baf3e3d6e451e8"}
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.044783 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-fm5j7"
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.049825 4906 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-fm5j7 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body=
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.049890 4906 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-fm5j7" podUID="16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused"
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.058452 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-x64jb" event={"ID":"43852740-4aee-4dcb-b90f-13e39c49ecc8","Type":"ContainerStarted","Data":"1a2fa4379961aae05419b0d61f1dfc6e3e5cd8f92f9c3e29e6458feee1b2ce7e"}
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.068178 4906 generic.go:334] "Generic (PLEG): container finished" podID="3d439193-5483-476d-9612-99dfb4558121" containerID="bd01d2c4da59947ea289a95297807e488871eb0f80a6f339b96be8bf17b83c89" exitCode=0
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.068294 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kf6pc" event={"ID":"3d439193-5483-476d-9612-99dfb4558121","Type":"ContainerDied","Data":"bd01d2c4da59947ea289a95297807e488871eb0f80a6f339b96be8bf17b83c89"}
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.079361 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 21 00:10:11 crc kubenswrapper[4906]: E0221 00:10:11.080588 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:11.58057365 +0000 UTC m=+146.832161156 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.104867 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-rj6xk" event={"ID":"1b800839-991f-4aa2-be47-6cef1e5c81d0","Type":"ContainerStarted","Data":"5ea2170e14ecb21e24f359eebd363624af4d926460b0551ae3e5de9bf682ce2a"}
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.105371 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-rj6xk" event={"ID":"1b800839-991f-4aa2-be47-6cef1e5c81d0","Type":"ContainerStarted","Data":"b89160c9b02b6d2b5907d9ec992582b62c026b7bd13b1427f0e07e6fd9acf6d5"}
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.105466 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-rj6xk"
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.115326 4906 patch_prober.go:28] interesting pod/console-operator-58897d9998-rj6xk container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/readyz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body=
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.115388 4906 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-rj6xk" podUID="1b800839-991f-4aa2-be47-6cef1e5c81d0" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/readyz\": dial tcp 10.217.0.26:8443: connect: connection refused"
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.116158 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qxg6m" event={"ID":"3094818f-2083-4b5f-bf1d-691731224abd","Type":"ContainerStarted","Data":"f4be03db95369dbc616f472f5ae02f46bbd3457887ffa2bccca3a4147a2231cf"}
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.116197 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qxg6m" event={"ID":"3094818f-2083-4b5f-bf1d-691731224abd","Type":"ContainerStarted","Data":"86298339f1f22cb33b09fc15067d168c5d5dbfad7f5b75eb39df1801f78e9e28"}
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.116209 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qxg6m" event={"ID":"3094818f-2083-4b5f-bf1d-691731224abd","Type":"ContainerStarted","Data":"96c939f993cec634e11550635870862de27776efbb3703941518c99191735c01"}
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.127514 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w9crs" podStartSLOduration=125.127498402 podStartE2EDuration="2m5.127498402s" podCreationTimestamp="2026-02-21 00:08:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:10:11.125977809 +0000 UTC m=+146.377565315" watchObservedRunningTime="2026-02-21 00:10:11.127498402 +0000 UTC m=+146.379085908"
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.128640 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-fqmc8" podStartSLOduration=125.128631234 podStartE2EDuration="2m5.128631234s" podCreationTimestamp="2026-02-21 00:08:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:10:11.06797636 +0000 UTC m=+146.319563866" watchObservedRunningTime="2026-02-21 00:10:11.128631234 +0000 UTC m=+146.380218740"
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.131125 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-21 00:05:10 +0000 UTC, rotation deadline is 2026-12-19 19:45:46.033942433 +0000 UTC
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.131147 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7243h35m34.902798077s for next certificate rotation
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.138636 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vmpkj" event={"ID":"6ff05c62-73da-4640-8c3f-8b846c14296c","Type":"ContainerStarted","Data":"a1f34b9bc5481d9a85397b69aff6b92ed71675478e4401a5d6802fe5de2b6d72"}
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.138725 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vmpkj" event={"ID":"6ff05c62-73da-4640-8c3f-8b846c14296c","Type":"ContainerStarted","Data":"4efbdff2c070751550a1417000ba51ebb87a30cdc55c40d42a37daf443eed73e"}
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.144745 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8l88x" event={"ID":"27421dfc-6497-4ba3-8887-f5dc90ce57d9","Type":"ContainerStarted","Data":"f56679fe1b0a14e07f2d30fa85dc81d043824fe4bb2a99f5ff002e4cb0c069b0"}
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.144801 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8l88x" event={"ID":"27421dfc-6497-4ba3-8887-f5dc90ce57d9","Type":"ContainerStarted","Data":"f9741b837575da6ea392da1462a80c0b35f816b6afc436ba46e622350d422dc4"}
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.147902 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-fm5j7" podStartSLOduration=125.147891315 podStartE2EDuration="2m5.147891315s" podCreationTimestamp="2026-02-21 00:08:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:10:11.145812996 +0000 UTC m=+146.397400502" watchObservedRunningTime="2026-02-21 00:10:11.147891315 +0000 UTC m=+146.399478821"
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.172288 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xjnhh" event={"ID":"37fa17e7-c0eb-4d5a-a36e-fe257b56e1cb","Type":"ContainerStarted","Data":"baf7894433811dc8ef928aab214cbe885e7385e7ed53010d8fc451cb4ffa672a"}
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.172333 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xjnhh" event={"ID":"37fa17e7-c0eb-4d5a-a36e-fe257b56e1cb","Type":"ContainerStarted","Data":"5a7db864f173e4f5aac25c30b7cd52d949d0f94d8ac3315fc28fd447dfe5701d"}
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.172343 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xjnhh" event={"ID":"37fa17e7-c0eb-4d5a-a36e-fe257b56e1cb","Type":"ContainerStarted","Data":"c1311534e67a55f4bc6d07e171e7b8685b053dcdee4b78b4ff87c768757f28f8"}
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.175241 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-6v9gm" event={"ID":"a86819ff-277a-4ef9-8bef-3de42f9406fc","Type":"ContainerStarted","Data":"7006212760d6b1923953a3bc22bc59a8edfb273d424b8e409b02d4fb637f93c4"}
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.175280 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-6v9gm" event={"ID":"a86819ff-277a-4ef9-8bef-3de42f9406fc","Type":"ContainerStarted","Data":"a041d6a1f7bf89a1ee1ac3fe66e538fe26f95a08d0db5968f406c81c8c09e870"}
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.177182 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29527200-nc4ns" event={"ID":"a1df0ec3-aa36-4685-8c9b-586beaf71340","Type":"ContainerStarted","Data":"4cd0bad5a7dd948d7d49f9ebcbbbc9bad1591d8af56e3a609304455b7143a1b5"}
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.178772 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hlxkl" event={"ID":"cd59cacf-f27e-4731-9dca-59b01415316f","Type":"ContainerStarted","Data":"8e457c8ca1f1486be36e4bbe5988a372b896ec59d89482f0af3332334d23b8b0"}
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.178798 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hlxkl" event={"ID":"cd59cacf-f27e-4731-9dca-59b01415316f","Type":"ContainerStarted","Data":"72bba6b0b711b5132a33f1ab82fa52a34ff57c7ed0bd1bef9a37ddc7c7712383"}
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.181066 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4"
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.183196 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v8z45" podStartSLOduration=125.183178454 podStartE2EDuration="2m5.183178454s" podCreationTimestamp="2026-02-21 00:08:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:10:11.164789968 +0000 UTC m=+146.416377474" watchObservedRunningTime="2026-02-21 00:10:11.183178454 +0000 UTC m=+146.434765950"
Feb 21 00:10:11 crc kubenswrapper[4906]: E0221 00:10:11.183523 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:11.683508203 +0000 UTC m=+146.935095709 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.184446 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-rj6xk" podStartSLOduration=126.18444137 podStartE2EDuration="2m6.18444137s" podCreationTimestamp="2026-02-21 00:08:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:10:11.182380721 +0000 UTC m=+146.433968237" watchObservedRunningTime="2026-02-21 00:10:11.18444137 +0000 UTC m=+146.436028876"
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.190839 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5gw6k" event={"ID":"ba583b66-6dbd-415a-b113-873eb19c4d4c","Type":"ContainerStarted","Data":"2dbf19620ca699216211f1cbe773429368b0d97df5f11663ab26b8b0cc2ca63a"}
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.190884 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5gw6k" event={"ID":"ba583b66-6dbd-415a-b113-873eb19c4d4c","Type":"ContainerStarted","Data":"5dfcf8117e1a74203ca84eff86a0c38b4c422900c86c405881338fbc559d544f"}
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.193931 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-667sz" event={"ID":"ee0bafe0-cacc-4996-9675-a87c77f3984b","Type":"ContainerStarted","Data":"b2d78cfd399fc1112e821c2b2c3a1b51eda5f295a6aaed920635965cffb24650"}
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.194144 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-667sz"
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.199056 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j" event={"ID":"e0f1e160-28c6-4ff5-8b24-8c962f120747","Type":"ContainerStarted","Data":"20e4fbf06d46e679cf87d69a35a81277af7a8c3e978e9ad9b1a12951f78dade2"}
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.200836 4906 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-9kg2j container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.41:6443/healthz\": dial tcp 10.217.0.41:6443: connect: connection refused" start-of-body=
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.200872 4906 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j" podUID="e0f1e160-28c6-4ff5-8b24-8c962f120747" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.41:6443/healthz\": dial tcp 10.217.0.41:6443: connect: connection refused"
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.209893 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r9sr7" event={"ID":"51c6b6f9-3b1f-4028-acc1-dac92556b401","Type":"ContainerStarted","Data":"9d8a3f28e41ce77eee2ef8506d420ba9ff55b49b9a5cfe7dc9554d2020296024"}
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.211784 4906 patch_prober.go:28] interesting pod/downloads-7954f5f757-84vm6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body=
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.211814 4906 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-84vm6" podUID="4145877d-bf92-4ebc-8552-df3e4680eaf5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused"
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.280170 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-pruner-29527200-5vtx4" podStartSLOduration=126.280145027 podStartE2EDuration="2m6.280145027s" podCreationTimestamp="2026-02-21 00:08:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:10:11.243236241 +0000 UTC m=+146.494823747" watchObservedRunningTime="2026-02-21 00:10:11.280145027 +0000 UTC m=+146.531732533"
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.281177 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qxg6m" podStartSLOduration=125.281169276 podStartE2EDuration="2m5.281169276s" podCreationTimestamp="2026-02-21 00:08:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:10:11.272936651 +0000 UTC m=+146.524524177" watchObservedRunningTime="2026-02-21 00:10:11.281169276 +0000 UTC m=+146.532756782"
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.285845 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 21 00:10:11 crc kubenswrapper[4906]: E0221 00:10:11.302589 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:11.802564118 +0000 UTC m=+147.054151624 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.303007 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-gqmhh" podStartSLOduration=8.30299124 podStartE2EDuration="8.30299124s" podCreationTimestamp="2026-02-21 00:10:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:10:11.302933758 +0000 UTC m=+146.554521264" watchObservedRunningTime="2026-02-21 00:10:11.30299124 +0000 UTC m=+146.554578746"
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.372831 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-6v9gm" podStartSLOduration=125.372816796 podStartE2EDuration="2m5.372816796s" podCreationTimestamp="2026-02-21 00:08:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:10:11.340008398 +0000 UTC m=+146.591595914" watchObservedRunningTime="2026-02-21 00:10:11.372816796 +0000 UTC m=+146.624404302"
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.373746 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vmpkj" podStartSLOduration=126.373740713 podStartE2EDuration="2m6.373740713s" podCreationTimestamp="2026-02-21 00:08:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:10:11.372067115 +0000 UTC m=+146.623654621" watchObservedRunningTime="2026-02-21 00:10:11.373740713 +0000 UTC m=+146.625328219"
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.391016 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4"
Feb 21 00:10:11 crc kubenswrapper[4906]: E0221 00:10:11.391412 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:11.891399198 +0000 UTC m=+147.142986704 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.407301 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5gw6k" podStartSLOduration=126.407286672 podStartE2EDuration="2m6.407286672s" podCreationTimestamp="2026-02-21 00:08:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:10:11.405370577 +0000 UTC m=+146.656958083" watchObservedRunningTime="2026-02-21 00:10:11.407286672 +0000 UTC m=+146.658874178"
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.458362 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r9sr7" podStartSLOduration=125.458346352 podStartE2EDuration="2m5.458346352s" podCreationTimestamp="2026-02-21 00:08:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:10:11.457184459 +0000 UTC m=+146.708771965" watchObservedRunningTime="2026-02-21 00:10:11.458346352 +0000 UTC m=+146.709933858"
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.458799 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xjnhh" podStartSLOduration=125.458793915 podStartE2EDuration="2m5.458793915s" podCreationTimestamp="2026-02-21 00:08:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:10:11.42609469 +0000 UTC m=+146.677682186" watchObservedRunningTime="2026-02-21 00:10:11.458793915 +0000 UTC m=+146.710381421"
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.480168 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hlxkl" podStartSLOduration=125.480152135 podStartE2EDuration="2m5.480152135s" podCreationTimestamp="2026-02-21 00:08:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:10:11.478650772 +0000 UTC m=+146.730238278" watchObservedRunningTime="2026-02-21 00:10:11.480152135 +0000 UTC m=+146.731739641"
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.485859 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r9sr7"
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.485917 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r9sr7"
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.498184 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.498342 4906 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-r9sr7 container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.10:8443/livez\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body=
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.498402 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r9sr7" podUID="51c6b6f9-3b1f-4028-acc1-dac92556b401" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.10:8443/livez\": dial tcp 10.217.0.10:8443: connect: connection refused"
Feb 21 00:10:11 crc kubenswrapper[4906]: E0221 00:10:11.498589 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:11.998572552 +0000 UTC m=+147.250160058 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.508020 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-667sz" podStartSLOduration=125.507987691 podStartE2EDuration="2m5.507987691s" podCreationTimestamp="2026-02-21 00:08:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:10:11.504662876 +0000 UTC m=+146.756250382" watchObservedRunningTime="2026-02-21 00:10:11.507987691 +0000 UTC m=+146.759575197"
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.535267 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-8l88x" podStartSLOduration=7.535245441 podStartE2EDuration="7.535245441s" podCreationTimestamp="2026-02-21 00:10:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:10:11.531491493 +0000 UTC m=+146.783078999" watchObservedRunningTime="2026-02-21 00:10:11.535245441 +0000 UTC m=+146.786832947"
Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.599786 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4"
Feb 21 00:10:11 crc kubenswrapper[4906]: E0221 00:10:11.600213 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:12.100199828 +0000 UTC m=+147.351787334 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.704236 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:11 crc kubenswrapper[4906]: E0221 00:10:11.704965 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:12.204948233 +0000 UTC m=+147.456535739 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.806639 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:11 crc kubenswrapper[4906]: E0221 00:10:11.806993 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:12.30697975 +0000 UTC m=+147.558567256 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.908741 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:11 crc kubenswrapper[4906]: E0221 00:10:11.908920 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:12.408894224 +0000 UTC m=+147.660481730 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.909197 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:11 crc kubenswrapper[4906]: E0221 00:10:11.909576 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:12.409555243 +0000 UTC m=+147.661142749 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.963425 4906 patch_prober.go:28] interesting pod/router-default-5444994796-bzr2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 21 00:10:11 crc kubenswrapper[4906]: [-]has-synced failed: reason withheld Feb 21 00:10:11 crc kubenswrapper[4906]: [+]process-running ok Feb 21 00:10:11 crc kubenswrapper[4906]: healthz check failed Feb 21 00:10:11 crc kubenswrapper[4906]: I0221 00:10:11.963495 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bzr2z" podUID="ead04ba4-ffa8-4bf8-ae26-c9014dfda96f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 21 00:10:12 crc kubenswrapper[4906]: I0221 00:10:12.010557 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:12 crc kubenswrapper[4906]: E0221 00:10:12.010753 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-21 00:10:12.510721784 +0000 UTC m=+147.762309301 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:12 crc kubenswrapper[4906]: I0221 00:10:12.010892 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:12 crc kubenswrapper[4906]: E0221 00:10:12.011263 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:12.5112546 +0000 UTC m=+147.762842106 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:12 crc kubenswrapper[4906]: I0221 00:10:12.112533 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:12 crc kubenswrapper[4906]: E0221 00:10:12.112788 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:12.612759122 +0000 UTC m=+147.864346628 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:12 crc kubenswrapper[4906]: I0221 00:10:12.112894 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:12 crc kubenswrapper[4906]: E0221 00:10:12.113213 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:12.613205805 +0000 UTC m=+147.864793311 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:12 crc kubenswrapper[4906]: I0221 00:10:12.213589 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:12 crc kubenswrapper[4906]: E0221 00:10:12.213709 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:12.713672807 +0000 UTC m=+147.965260313 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:12 crc kubenswrapper[4906]: I0221 00:10:12.214166 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:12 crc kubenswrapper[4906]: E0221 00:10:12.214451 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:12.714436499 +0000 UTC m=+147.966024005 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:12 crc kubenswrapper[4906]: I0221 00:10:12.214736 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gqmhh" event={"ID":"69df0bca-c432-4fea-b47d-93a194a58243","Type":"ContainerStarted","Data":"1d7f74aa9269760ae3667c1642b3a868b1ea5c5a0188a0d0cec84383b2ff6ab6"} Feb 21 00:10:12 crc kubenswrapper[4906]: I0221 00:10:12.216896 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kf6pc" event={"ID":"3d439193-5483-476d-9612-99dfb4558121","Type":"ContainerStarted","Data":"b30d4621e5e61c386186f1662030607820291fe77c52c5388ed5fff8a8732f17"} Feb 21 00:10:12 crc kubenswrapper[4906]: I0221 00:10:12.217066 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kf6pc" Feb 21 00:10:12 crc kubenswrapper[4906]: I0221 00:10:12.218903 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mjrk7" event={"ID":"52481e6d-f985-4177-a719-d12248f049ac","Type":"ContainerStarted","Data":"cff2efd4b39c7d5d344f289423ed5bfb4a9c117234b35415665aaa7d9e66ef76"} Feb 21 00:10:12 crc kubenswrapper[4906]: I0221 00:10:12.221606 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-6578b" event={"ID":"6485254f-aeb0-4df4-ba6e-c9eaa0718933","Type":"ContainerStarted","Data":"a5fdc663a909962b7d4bc39dd2190e050ba01fd851d62b3639d9d477a59c2fe6"} Feb 21 00:10:12 crc 
kubenswrapper[4906]: I0221 00:10:12.221660 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-6578b" event={"ID":"6485254f-aeb0-4df4-ba6e-c9eaa0718933","Type":"ContainerStarted","Data":"0323fe2e5317dbd989370c08749293b6906e56465e92ca42c35977469ad39a5a"} Feb 21 00:10:12 crc kubenswrapper[4906]: I0221 00:10:12.223292 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-x64jb" event={"ID":"43852740-4aee-4dcb-b90f-13e39c49ecc8","Type":"ContainerStarted","Data":"e3c52f9fa61d7ec630a9de2addce2b2d5ed65559adc2891b8d826f69f13b4ed5"} Feb 21 00:10:12 crc kubenswrapper[4906]: I0221 00:10:12.224392 4906 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-fm5j7 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Feb 21 00:10:12 crc kubenswrapper[4906]: I0221 00:10:12.224401 4906 patch_prober.go:28] interesting pod/downloads-7954f5f757-84vm6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Feb 21 00:10:12 crc kubenswrapper[4906]: I0221 00:10:12.224431 4906 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-fm5j7" podUID="16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" Feb 21 00:10:12 crc kubenswrapper[4906]: I0221 00:10:12.224439 4906 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-84vm6" podUID="4145877d-bf92-4ebc-8552-df3e4680eaf5" containerName="download-server" probeResult="failure" output="Get 
\"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" Feb 21 00:10:12 crc kubenswrapper[4906]: I0221 00:10:12.229825 4906 patch_prober.go:28] interesting pod/console-operator-58897d9998-rj6xk container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/readyz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Feb 21 00:10:12 crc kubenswrapper[4906]: I0221 00:10:12.229879 4906 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-rj6xk" podUID="1b800839-991f-4aa2-be47-6cef1e5c81d0" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/readyz\": dial tcp 10.217.0.26:8443: connect: connection refused" Feb 21 00:10:12 crc kubenswrapper[4906]: I0221 00:10:12.235110 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j" Feb 21 00:10:12 crc kubenswrapper[4906]: I0221 00:10:12.238837 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kf6pc" podStartSLOduration=127.238806686 podStartE2EDuration="2m7.238806686s" podCreationTimestamp="2026-02-21 00:08:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:10:12.237645793 +0000 UTC m=+147.489233299" watchObservedRunningTime="2026-02-21 00:10:12.238806686 +0000 UTC m=+147.490394192" Feb 21 00:10:12 crc kubenswrapper[4906]: I0221 00:10:12.250037 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tn7tl" Feb 21 00:10:12 crc kubenswrapper[4906]: I0221 00:10:12.291449 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-apiserver/apiserver-76f77b778f-6578b" podStartSLOduration=126.291429401 podStartE2EDuration="2m6.291429401s" podCreationTimestamp="2026-02-21 00:08:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:10:12.289535946 +0000 UTC m=+147.541123452" watchObservedRunningTime="2026-02-21 00:10:12.291429401 +0000 UTC m=+147.543016907" Feb 21 00:10:12 crc kubenswrapper[4906]: I0221 00:10:12.323045 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:12 crc kubenswrapper[4906]: E0221 00:10:12.323195 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:12.823164788 +0000 UTC m=+148.074752294 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:12 crc kubenswrapper[4906]: I0221 00:10:12.323647 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:12 crc kubenswrapper[4906]: E0221 00:10:12.329266 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:12.829249792 +0000 UTC m=+148.080837298 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:12 crc kubenswrapper[4906]: I0221 00:10:12.392940 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-mjrk7" podStartSLOduration=126.392926243 podStartE2EDuration="2m6.392926243s" podCreationTimestamp="2026-02-21 00:08:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:10:12.371360616 +0000 UTC m=+147.622948122" watchObservedRunningTime="2026-02-21 00:10:12.392926243 +0000 UTC m=+147.644513749" Feb 21 00:10:12 crc kubenswrapper[4906]: I0221 00:10:12.430806 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:12 crc kubenswrapper[4906]: E0221 00:10:12.431122 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:12.931106664 +0000 UTC m=+148.182694170 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:12 crc kubenswrapper[4906]: I0221 00:10:12.478126 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-79w9t" Feb 21 00:10:12 crc kubenswrapper[4906]: I0221 00:10:12.533307 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:12 crc kubenswrapper[4906]: E0221 00:10:12.533589 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:13.033576924 +0000 UTC m=+148.285164430 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:12 crc kubenswrapper[4906]: I0221 00:10:12.634325 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:12 crc kubenswrapper[4906]: E0221 00:10:12.634701 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:13.134663635 +0000 UTC m=+148.386251151 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:12 crc kubenswrapper[4906]: I0221 00:10:12.736362 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:12 crc kubenswrapper[4906]: E0221 00:10:12.737070 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:13.237058512 +0000 UTC m=+148.488646008 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:12 crc kubenswrapper[4906]: I0221 00:10:12.837356 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:12 crc kubenswrapper[4906]: E0221 00:10:12.837718 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:13.33769595 +0000 UTC m=+148.589283456 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:12 crc kubenswrapper[4906]: I0221 00:10:12.882764 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w9crs" Feb 21 00:10:12 crc kubenswrapper[4906]: I0221 00:10:12.939094 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:12 crc kubenswrapper[4906]: E0221 00:10:12.939412 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:13.439398898 +0000 UTC m=+148.690986404 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:12 crc kubenswrapper[4906]: I0221 00:10:12.964151 4906 patch_prober.go:28] interesting pod/router-default-5444994796-bzr2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 21 00:10:12 crc kubenswrapper[4906]: [-]has-synced failed: reason withheld Feb 21 00:10:12 crc kubenswrapper[4906]: [+]process-running ok Feb 21 00:10:12 crc kubenswrapper[4906]: healthz check failed Feb 21 00:10:12 crc kubenswrapper[4906]: I0221 00:10:12.964199 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bzr2z" podUID="ead04ba4-ffa8-4bf8-ae26-c9014dfda96f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 21 00:10:13 crc kubenswrapper[4906]: I0221 00:10:13.040509 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:13 crc kubenswrapper[4906]: E0221 00:10:13.040856 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-21 00:10:13.540841408 +0000 UTC m=+148.792428914 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:13 crc kubenswrapper[4906]: I0221 00:10:13.123769 4906 patch_prober.go:28] interesting pod/machine-config-daemon-b9qdv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 00:10:13 crc kubenswrapper[4906]: I0221 00:10:13.123826 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 00:10:13 crc kubenswrapper[4906]: I0221 00:10:13.141714 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:13 crc kubenswrapper[4906]: E0221 00:10:13.142083 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-02-21 00:10:13.642070663 +0000 UTC m=+148.893658159 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:13 crc kubenswrapper[4906]: I0221 00:10:13.242719 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:13 crc kubenswrapper[4906]: E0221 00:10:13.242863 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:13.742835794 +0000 UTC m=+148.994423300 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:13 crc kubenswrapper[4906]: I0221 00:10:13.242999 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:13 crc kubenswrapper[4906]: E0221 00:10:13.243317 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:13.743305657 +0000 UTC m=+148.994893233 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:13 crc kubenswrapper[4906]: I0221 00:10:13.248422 4906 generic.go:334] "Generic (PLEG): container finished" podID="a1df0ec3-aa36-4685-8c9b-586beaf71340" containerID="4cd0bad5a7dd948d7d49f9ebcbbbc9bad1591d8af56e3a609304455b7143a1b5" exitCode=0 Feb 21 00:10:13 crc kubenswrapper[4906]: I0221 00:10:13.248477 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29527200-nc4ns" event={"ID":"a1df0ec3-aa36-4685-8c9b-586beaf71340","Type":"ContainerDied","Data":"4cd0bad5a7dd948d7d49f9ebcbbbc9bad1591d8af56e3a609304455b7143a1b5"} Feb 21 00:10:13 crc kubenswrapper[4906]: I0221 00:10:13.254746 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-x64jb" event={"ID":"43852740-4aee-4dcb-b90f-13e39c49ecc8","Type":"ContainerStarted","Data":"830f172d45565d431d4582005fdca394106c81b7f35280a9bef347884ff40401"} Feb 21 00:10:13 crc kubenswrapper[4906]: I0221 00:10:13.322959 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-fm5j7" Feb 21 00:10:13 crc kubenswrapper[4906]: I0221 00:10:13.344273 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:13 crc 
kubenswrapper[4906]: E0221 00:10:13.344453 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:13.844427328 +0000 UTC m=+149.096014834 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:13 crc kubenswrapper[4906]: I0221 00:10:13.346363 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:13 crc kubenswrapper[4906]: E0221 00:10:13.346858 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:13.846840937 +0000 UTC m=+149.098428443 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:13 crc kubenswrapper[4906]: I0221 00:10:13.447475 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:13 crc kubenswrapper[4906]: E0221 00:10:13.447891 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:13.947875736 +0000 UTC m=+149.199463242 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:13 crc kubenswrapper[4906]: I0221 00:10:13.548367 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:13 crc kubenswrapper[4906]: E0221 00:10:13.548723 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:14.048704579 +0000 UTC m=+149.300292085 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:13 crc kubenswrapper[4906]: I0221 00:10:13.610500 4906 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 21 00:10:13 crc kubenswrapper[4906]: I0221 00:10:13.649245 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:13 crc kubenswrapper[4906]: I0221 00:10:13.649433 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:10:13 crc kubenswrapper[4906]: I0221 00:10:13.649481 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:10:13 crc 
kubenswrapper[4906]: I0221 00:10:13.649504 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:10:13 crc kubenswrapper[4906]: I0221 00:10:13.649553 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:10:13 crc kubenswrapper[4906]: E0221 00:10:13.650226 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 00:10:14.150188761 +0000 UTC m=+149.401776267 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:13 crc kubenswrapper[4906]: I0221 00:10:13.650739 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:10:13 crc kubenswrapper[4906]: I0221 00:10:13.655777 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:10:13 crc kubenswrapper[4906]: I0221 00:10:13.655928 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:10:13 crc kubenswrapper[4906]: I0221 00:10:13.671452 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:10:13 crc kubenswrapper[4906]: I0221 00:10:13.751003 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:13 crc kubenswrapper[4906]: E0221 00:10:13.751506 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 00:10:14.251479507 +0000 UTC m=+149.503067043 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2xzp4" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 00:10:13 crc kubenswrapper[4906]: I0221 00:10:13.791724 4906 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-21T00:10:13.610526587Z","Handler":null,"Name":""} Feb 21 00:10:13 crc kubenswrapper[4906]: I0221 00:10:13.804578 4906 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 21 00:10:13 crc kubenswrapper[4906]: I0221 00:10:13.804616 4906 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 21 00:10:13 crc kubenswrapper[4906]: I0221 00:10:13.847918 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 00:10:13 crc kubenswrapper[4906]: I0221 00:10:13.856314 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 00:10:13 crc kubenswrapper[4906]: I0221 00:10:13.860250 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:10:13 crc kubenswrapper[4906]: I0221 00:10:13.864493 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 21 00:10:13 crc kubenswrapper[4906]: I0221 00:10:13.873953 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 00:10:13 crc kubenswrapper[4906]: I0221 00:10:13.927832 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ptwns"] Feb 21 00:10:13 crc kubenswrapper[4906]: I0221 00:10:13.929128 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ptwns" Feb 21 00:10:13 crc kubenswrapper[4906]: I0221 00:10:13.931435 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 21 00:10:13 crc kubenswrapper[4906]: I0221 00:10:13.943139 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ptwns"] Feb 21 00:10:13 crc kubenswrapper[4906]: I0221 00:10:13.958270 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:13 crc kubenswrapper[4906]: I0221 00:10:13.965811 4906 patch_prober.go:28] interesting pod/router-default-5444994796-bzr2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 21 00:10:13 crc kubenswrapper[4906]: [-]has-synced failed: reason withheld Feb 21 00:10:13 crc kubenswrapper[4906]: [+]process-running ok Feb 21 00:10:13 crc kubenswrapper[4906]: healthz check failed Feb 21 00:10:13 crc kubenswrapper[4906]: I0221 00:10:13.965856 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bzr2z" podUID="ead04ba4-ffa8-4bf8-ae26-c9014dfda96f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 21 00:10:13 crc kubenswrapper[4906]: I0221 00:10:13.970597 4906 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 21 00:10:13 crc kubenswrapper[4906]: I0221 00:10:13.970644 4906 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.054553 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2xzp4\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.059740 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ac4beba-19e0-4278-9701-d830f33d6688-utilities\") pod \"community-operators-ptwns\" (UID: \"2ac4beba-19e0-4278-9701-d830f33d6688\") " pod="openshift-marketplace/community-operators-ptwns" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.059783 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9flm\" (UniqueName: \"kubernetes.io/projected/2ac4beba-19e0-4278-9701-d830f33d6688-kube-api-access-q9flm\") pod \"community-operators-ptwns\" (UID: \"2ac4beba-19e0-4278-9701-d830f33d6688\") " pod="openshift-marketplace/community-operators-ptwns" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.059808 4906 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ac4beba-19e0-4278-9701-d830f33d6688-catalog-content\") pod \"community-operators-ptwns\" (UID: \"2ac4beba-19e0-4278-9701-d830f33d6688\") " pod="openshift-marketplace/community-operators-ptwns" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.063004 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.126301 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zf44m"] Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.127867 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zf44m" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.131217 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.161400 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ac4beba-19e0-4278-9701-d830f33d6688-utilities\") pod \"community-operators-ptwns\" (UID: \"2ac4beba-19e0-4278-9701-d830f33d6688\") " pod="openshift-marketplace/community-operators-ptwns" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.161449 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9flm\" (UniqueName: \"kubernetes.io/projected/2ac4beba-19e0-4278-9701-d830f33d6688-kube-api-access-q9flm\") pod \"community-operators-ptwns\" (UID: \"2ac4beba-19e0-4278-9701-d830f33d6688\") " pod="openshift-marketplace/community-operators-ptwns" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.161476 4906 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ac4beba-19e0-4278-9701-d830f33d6688-catalog-content\") pod \"community-operators-ptwns\" (UID: \"2ac4beba-19e0-4278-9701-d830f33d6688\") " pod="openshift-marketplace/community-operators-ptwns" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.161977 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ac4beba-19e0-4278-9701-d830f33d6688-catalog-content\") pod \"community-operators-ptwns\" (UID: \"2ac4beba-19e0-4278-9701-d830f33d6688\") " pod="openshift-marketplace/community-operators-ptwns" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.162298 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ac4beba-19e0-4278-9701-d830f33d6688-utilities\") pod \"community-operators-ptwns\" (UID: \"2ac4beba-19e0-4278-9701-d830f33d6688\") " pod="openshift-marketplace/community-operators-ptwns" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.197209 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9flm\" (UniqueName: \"kubernetes.io/projected/2ac4beba-19e0-4278-9701-d830f33d6688-kube-api-access-q9flm\") pod \"community-operators-ptwns\" (UID: \"2ac4beba-19e0-4278-9701-d830f33d6688\") " pod="openshift-marketplace/community-operators-ptwns" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.200390 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zf44m"] Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.249896 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ptwns" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.263254 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c851d93-c82d-4d73-b456-47c8fa6e3f5d-catalog-content\") pod \"certified-operators-zf44m\" (UID: \"0c851d93-c82d-4d73-b456-47c8fa6e3f5d\") " pod="openshift-marketplace/certified-operators-zf44m" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.263288 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c851d93-c82d-4d73-b456-47c8fa6e3f5d-utilities\") pod \"certified-operators-zf44m\" (UID: \"0c851d93-c82d-4d73-b456-47c8fa6e3f5d\") " pod="openshift-marketplace/certified-operators-zf44m" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.263323 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcnsh\" (UniqueName: \"kubernetes.io/projected/0c851d93-c82d-4d73-b456-47c8fa6e3f5d-kube-api-access-bcnsh\") pod \"certified-operators-zf44m\" (UID: \"0c851d93-c82d-4d73-b456-47c8fa6e3f5d\") " pod="openshift-marketplace/certified-operators-zf44m" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.264894 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"31397000e4341e0c749a06f8ce01500e60be7b5fc4da551a13b5935583b6a5c7"} Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.266132 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"e8bec2bae70f0ad10583ca4bebb245f6884823a4aede639526ea2d71a775717f"} Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.277973 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-x64jb" event={"ID":"43852740-4aee-4dcb-b90f-13e39c49ecc8","Type":"ContainerStarted","Data":"8cd038e4b58ad85ee1fb7eb8eed2eb98900f25993e5185d5d168c9ddb7d298d8"} Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.278030 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-x64jb" event={"ID":"43852740-4aee-4dcb-b90f-13e39c49ecc8","Type":"ContainerStarted","Data":"83830f14b9f3edf54cddace66d6c69017f03cf057be7e5930f77b8d6572a032c"} Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.302343 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-x64jb" podStartSLOduration=11.302323327 podStartE2EDuration="11.302323327s" podCreationTimestamp="2026-02-21 00:10:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:10:14.296870671 +0000 UTC m=+149.548458197" watchObservedRunningTime="2026-02-21 00:10:14.302323327 +0000 UTC m=+149.553910853" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.326347 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gbrvn"] Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.334661 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gbrvn"] Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.334788 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gbrvn" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.373300 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c851d93-c82d-4d73-b456-47c8fa6e3f5d-catalog-content\") pod \"certified-operators-zf44m\" (UID: \"0c851d93-c82d-4d73-b456-47c8fa6e3f5d\") " pod="openshift-marketplace/certified-operators-zf44m" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.373372 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c851d93-c82d-4d73-b456-47c8fa6e3f5d-utilities\") pod \"certified-operators-zf44m\" (UID: \"0c851d93-c82d-4d73-b456-47c8fa6e3f5d\") " pod="openshift-marketplace/certified-operators-zf44m" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.373436 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcnsh\" (UniqueName: \"kubernetes.io/projected/0c851d93-c82d-4d73-b456-47c8fa6e3f5d-kube-api-access-bcnsh\") pod \"certified-operators-zf44m\" (UID: \"0c851d93-c82d-4d73-b456-47c8fa6e3f5d\") " pod="openshift-marketplace/certified-operators-zf44m" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.374244 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c851d93-c82d-4d73-b456-47c8fa6e3f5d-catalog-content\") pod \"certified-operators-zf44m\" (UID: \"0c851d93-c82d-4d73-b456-47c8fa6e3f5d\") " pod="openshift-marketplace/certified-operators-zf44m" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.374822 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c851d93-c82d-4d73-b456-47c8fa6e3f5d-utilities\") pod \"certified-operators-zf44m\" (UID: \"0c851d93-c82d-4d73-b456-47c8fa6e3f5d\") " 
pod="openshift-marketplace/certified-operators-zf44m" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.389558 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2xzp4"] Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.400279 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcnsh\" (UniqueName: \"kubernetes.io/projected/0c851d93-c82d-4d73-b456-47c8fa6e3f5d-kube-api-access-bcnsh\") pod \"certified-operators-zf44m\" (UID: \"0c851d93-c82d-4d73-b456-47c8fa6e3f5d\") " pod="openshift-marketplace/certified-operators-zf44m" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.451893 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zf44m" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.480817 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73edbe76-4cd4-45e2-903c-8329ddfa9f58-catalog-content\") pod \"community-operators-gbrvn\" (UID: \"73edbe76-4cd4-45e2-903c-8329ddfa9f58\") " pod="openshift-marketplace/community-operators-gbrvn" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.480913 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73edbe76-4cd4-45e2-903c-8329ddfa9f58-utilities\") pod \"community-operators-gbrvn\" (UID: \"73edbe76-4cd4-45e2-903c-8329ddfa9f58\") " pod="openshift-marketplace/community-operators-gbrvn" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.480992 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd8vj\" (UniqueName: \"kubernetes.io/projected/73edbe76-4cd4-45e2-903c-8329ddfa9f58-kube-api-access-wd8vj\") pod \"community-operators-gbrvn\" (UID: 
\"73edbe76-4cd4-45e2-903c-8329ddfa9f58\") " pod="openshift-marketplace/community-operators-gbrvn" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.533759 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-m8xm9"] Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.535180 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m8xm9" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.538662 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m8xm9"] Feb 21 00:10:14 crc kubenswrapper[4906]: W0221 00:10:14.575312 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ac4beba_19e0_4278_9701_d830f33d6688.slice/crio-2a75c1f1ecf8eba4e78df89cb0639ffbdf3a5066a258dc84693ac107fa6aaa8f WatchSource:0}: Error finding container 2a75c1f1ecf8eba4e78df89cb0639ffbdf3a5066a258dc84693ac107fa6aaa8f: Status 404 returned error can't find the container with id 2a75c1f1ecf8eba4e78df89cb0639ffbdf3a5066a258dc84693ac107fa6aaa8f Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.576267 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29527200-nc4ns" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.576914 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ptwns"] Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.582035 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wd8vj\" (UniqueName: \"kubernetes.io/projected/73edbe76-4cd4-45e2-903c-8329ddfa9f58-kube-api-access-wd8vj\") pod \"community-operators-gbrvn\" (UID: \"73edbe76-4cd4-45e2-903c-8329ddfa9f58\") " pod="openshift-marketplace/community-operators-gbrvn" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.582078 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73edbe76-4cd4-45e2-903c-8329ddfa9f58-catalog-content\") pod \"community-operators-gbrvn\" (UID: \"73edbe76-4cd4-45e2-903c-8329ddfa9f58\") " pod="openshift-marketplace/community-operators-gbrvn" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.582126 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73edbe76-4cd4-45e2-903c-8329ddfa9f58-utilities\") pod \"community-operators-gbrvn\" (UID: \"73edbe76-4cd4-45e2-903c-8329ddfa9f58\") " pod="openshift-marketplace/community-operators-gbrvn" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.582453 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73edbe76-4cd4-45e2-903c-8329ddfa9f58-utilities\") pod \"community-operators-gbrvn\" (UID: \"73edbe76-4cd4-45e2-903c-8329ddfa9f58\") " pod="openshift-marketplace/community-operators-gbrvn" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.582489 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/73edbe76-4cd4-45e2-903c-8329ddfa9f58-catalog-content\") pod \"community-operators-gbrvn\" (UID: \"73edbe76-4cd4-45e2-903c-8329ddfa9f58\") " pod="openshift-marketplace/community-operators-gbrvn" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.604229 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd8vj\" (UniqueName: \"kubernetes.io/projected/73edbe76-4cd4-45e2-903c-8329ddfa9f58-kube-api-access-wd8vj\") pod \"community-operators-gbrvn\" (UID: \"73edbe76-4cd4-45e2-903c-8329ddfa9f58\") " pod="openshift-marketplace/community-operators-gbrvn" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.673256 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gbrvn" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.683105 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a1df0ec3-aa36-4685-8c9b-586beaf71340-config-volume\") pod \"a1df0ec3-aa36-4685-8c9b-586beaf71340\" (UID: \"a1df0ec3-aa36-4685-8c9b-586beaf71340\") " Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.683450 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a1df0ec3-aa36-4685-8c9b-586beaf71340-secret-volume\") pod \"a1df0ec3-aa36-4685-8c9b-586beaf71340\" (UID: \"a1df0ec3-aa36-4685-8c9b-586beaf71340\") " Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.683473 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5t46q\" (UniqueName: \"kubernetes.io/projected/a1df0ec3-aa36-4685-8c9b-586beaf71340-kube-api-access-5t46q\") pod \"a1df0ec3-aa36-4685-8c9b-586beaf71340\" (UID: \"a1df0ec3-aa36-4685-8c9b-586beaf71340\") " Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.683605 4906 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c69b5bb2-d2af-4aa6-a35e-9c96b3f78622-utilities\") pod \"certified-operators-m8xm9\" (UID: \"c69b5bb2-d2af-4aa6-a35e-9c96b3f78622\") " pod="openshift-marketplace/certified-operators-m8xm9" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.683626 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpwgs\" (UniqueName: \"kubernetes.io/projected/c69b5bb2-d2af-4aa6-a35e-9c96b3f78622-kube-api-access-kpwgs\") pod \"certified-operators-m8xm9\" (UID: \"c69b5bb2-d2af-4aa6-a35e-9c96b3f78622\") " pod="openshift-marketplace/certified-operators-m8xm9" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.683670 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c69b5bb2-d2af-4aa6-a35e-9c96b3f78622-catalog-content\") pod \"certified-operators-m8xm9\" (UID: \"c69b5bb2-d2af-4aa6-a35e-9c96b3f78622\") " pod="openshift-marketplace/certified-operators-m8xm9" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.684171 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1df0ec3-aa36-4685-8c9b-586beaf71340-config-volume" (OuterVolumeSpecName: "config-volume") pod "a1df0ec3-aa36-4685-8c9b-586beaf71340" (UID: "a1df0ec3-aa36-4685-8c9b-586beaf71340"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.688465 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1df0ec3-aa36-4685-8c9b-586beaf71340-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a1df0ec3-aa36-4685-8c9b-586beaf71340" (UID: "a1df0ec3-aa36-4685-8c9b-586beaf71340"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.693362 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1df0ec3-aa36-4685-8c9b-586beaf71340-kube-api-access-5t46q" (OuterVolumeSpecName: "kube-api-access-5t46q") pod "a1df0ec3-aa36-4685-8c9b-586beaf71340" (UID: "a1df0ec3-aa36-4685-8c9b-586beaf71340"). InnerVolumeSpecName "kube-api-access-5t46q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.730036 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zf44m"] Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.788530 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c69b5bb2-d2af-4aa6-a35e-9c96b3f78622-catalog-content\") pod \"certified-operators-m8xm9\" (UID: \"c69b5bb2-d2af-4aa6-a35e-9c96b3f78622\") " pod="openshift-marketplace/certified-operators-m8xm9" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.788870 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c69b5bb2-d2af-4aa6-a35e-9c96b3f78622-utilities\") pod \"certified-operators-m8xm9\" (UID: \"c69b5bb2-d2af-4aa6-a35e-9c96b3f78622\") " pod="openshift-marketplace/certified-operators-m8xm9" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.788900 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpwgs\" (UniqueName: \"kubernetes.io/projected/c69b5bb2-d2af-4aa6-a35e-9c96b3f78622-kube-api-access-kpwgs\") pod \"certified-operators-m8xm9\" (UID: \"c69b5bb2-d2af-4aa6-a35e-9c96b3f78622\") " pod="openshift-marketplace/certified-operators-m8xm9" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.788966 4906 reconciler_common.go:293] "Volume detached for 
volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a1df0ec3-aa36-4685-8c9b-586beaf71340-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.788983 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5t46q\" (UniqueName: \"kubernetes.io/projected/a1df0ec3-aa36-4685-8c9b-586beaf71340-kube-api-access-5t46q\") on node \"crc\" DevicePath \"\"" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.788995 4906 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a1df0ec3-aa36-4685-8c9b-586beaf71340-config-volume\") on node \"crc\" DevicePath \"\"" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.789926 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c69b5bb2-d2af-4aa6-a35e-9c96b3f78622-utilities\") pod \"certified-operators-m8xm9\" (UID: \"c69b5bb2-d2af-4aa6-a35e-9c96b3f78622\") " pod="openshift-marketplace/certified-operators-m8xm9" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.790424 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c69b5bb2-d2af-4aa6-a35e-9c96b3f78622-catalog-content\") pod \"certified-operators-m8xm9\" (UID: \"c69b5bb2-d2af-4aa6-a35e-9c96b3f78622\") " pod="openshift-marketplace/certified-operators-m8xm9" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.810896 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpwgs\" (UniqueName: \"kubernetes.io/projected/c69b5bb2-d2af-4aa6-a35e-9c96b3f78622-kube-api-access-kpwgs\") pod \"certified-operators-m8xm9\" (UID: \"c69b5bb2-d2af-4aa6-a35e-9c96b3f78622\") " pod="openshift-marketplace/certified-operators-m8xm9" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.871439 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-m8xm9" Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.914398 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gbrvn"] Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.962369 4906 patch_prober.go:28] interesting pod/router-default-5444994796-bzr2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 21 00:10:14 crc kubenswrapper[4906]: [-]has-synced failed: reason withheld Feb 21 00:10:14 crc kubenswrapper[4906]: [+]process-running ok Feb 21 00:10:14 crc kubenswrapper[4906]: healthz check failed Feb 21 00:10:14 crc kubenswrapper[4906]: I0221 00:10:14.962436 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bzr2z" podUID="ead04ba4-ffa8-4bf8-ae26-c9014dfda96f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 21 00:10:14 crc kubenswrapper[4906]: W0221 00:10:14.970791 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73edbe76_4cd4_45e2_903c_8329ddfa9f58.slice/crio-d2314e5c868365760402b8a8885f9c20460cb0f6de91ff8a93d0e8757733f445 WatchSource:0}: Error finding container d2314e5c868365760402b8a8885f9c20460cb0f6de91ff8a93d0e8757733f445: Status 404 returned error can't find the container with id d2314e5c868365760402b8a8885f9c20460cb0f6de91ff8a93d0e8757733f445 Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.051710 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 21 00:10:15 crc kubenswrapper[4906]: E0221 00:10:15.051907 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1df0ec3-aa36-4685-8c9b-586beaf71340" containerName="collect-profiles" Feb 
21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.051918 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1df0ec3-aa36-4685-8c9b-586beaf71340" containerName="collect-profiles" Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.052005 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1df0ec3-aa36-4685-8c9b-586beaf71340" containerName="collect-profiles" Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.052325 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.053739 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.054036 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.071486 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.121399 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m8xm9"] Feb 21 00:10:15 crc kubenswrapper[4906]: W0221 00:10:15.136964 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc69b5bb2_d2af_4aa6_a35e_9c96b3f78622.slice/crio-7d56d8e1f775ca54b5f7b567c49248398dc62ecfa0afb4e2dfd1c3db6560cecc WatchSource:0}: Error finding container 7d56d8e1f775ca54b5f7b567c49248398dc62ecfa0afb4e2dfd1c3db6560cecc: Status 404 returned error can't find the container with id 7d56d8e1f775ca54b5f7b567c49248398dc62ecfa0afb4e2dfd1c3db6560cecc Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.151291 4906 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.151923 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.155572 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.157152 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.165274 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.199854 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f502088-4131-4a3a-ae5c-d4beff0db583-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8f502088-4131-4a3a-ae5c-d4beff0db583\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.199907 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8f502088-4131-4a3a-ae5c-d4beff0db583-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8f502088-4131-4a3a-ae5c-d4beff0db583\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.283702 4906 generic.go:334] "Generic (PLEG): container finished" podID="73edbe76-4cd4-45e2-903c-8329ddfa9f58" containerID="a967b27cf6b8d5317d66b7c8eecddc59d62789af43b80a2207af1c74e83cb6b3" exitCode=0 Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.283813 4906 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-gbrvn" event={"ID":"73edbe76-4cd4-45e2-903c-8329ddfa9f58","Type":"ContainerDied","Data":"a967b27cf6b8d5317d66b7c8eecddc59d62789af43b80a2207af1c74e83cb6b3"} Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.283861 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gbrvn" event={"ID":"73edbe76-4cd4-45e2-903c-8329ddfa9f58","Type":"ContainerStarted","Data":"d2314e5c868365760402b8a8885f9c20460cb0f6de91ff8a93d0e8757733f445"} Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.285898 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"3521912656ee5d43282a4875cf94b8cd2e0b6d14f62c6dbb5981495a4b399794"} Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.286429 4906 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.287094 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m8xm9" event={"ID":"c69b5bb2-d2af-4aa6-a35e-9c96b3f78622","Type":"ContainerStarted","Data":"7d56d8e1f775ca54b5f7b567c49248398dc62ecfa0afb4e2dfd1c3db6560cecc"} Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.289108 4906 generic.go:334] "Generic (PLEG): container finished" podID="2ac4beba-19e0-4278-9701-d830f33d6688" containerID="e6176edb28add9bd21bed334cd7e3c98e406d1ab2ab483ff476938b64e1ae8cb" exitCode=0 Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.289142 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ptwns" event={"ID":"2ac4beba-19e0-4278-9701-d830f33d6688","Type":"ContainerDied","Data":"e6176edb28add9bd21bed334cd7e3c98e406d1ab2ab483ff476938b64e1ae8cb"} Feb 21 00:10:15 crc kubenswrapper[4906]: 
I0221 00:10:15.289165 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ptwns" event={"ID":"2ac4beba-19e0-4278-9701-d830f33d6688","Type":"ContainerStarted","Data":"2a75c1f1ecf8eba4e78df89cb0639ffbdf3a5066a258dc84693ac107fa6aaa8f"} Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.291040 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"7d8ef1067adc33fbf6faf1d3ef97c7f3a8560adbe603528a62023ff2034febe2"} Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.291534 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.295284 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29527200-nc4ns" Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.295318 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29527200-nc4ns" event={"ID":"a1df0ec3-aa36-4685-8c9b-586beaf71340","Type":"ContainerDied","Data":"1ecbec0a38931ace3380df0462e163f887586218ae597dd9318f161254b7ea8f"} Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.295355 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ecbec0a38931ace3380df0462e163f887586218ae597dd9318f161254b7ea8f" Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.297062 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" event={"ID":"cdadab89-f0cf-4bd7-af7e-17c67a65688a","Type":"ContainerStarted","Data":"3e3a3191e1ec4ce4a512c8b0e42f22812ad4fa09790c420d4caffa085b32019a"} Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.297100 4906 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" event={"ID":"cdadab89-f0cf-4bd7-af7e-17c67a65688a","Type":"ContainerStarted","Data":"86e7a5b8618ec7c0d567f3ab48ab570af83b82d1864b1cb46276f6093e9c6295"} Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.297193 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.298639 4906 generic.go:334] "Generic (PLEG): container finished" podID="0c851d93-c82d-4d73-b456-47c8fa6e3f5d" containerID="bbbfe5e1c99edf158d5b26a4e5af605ad0ebba3afc6be58bdfd011bb47f32f71" exitCode=0 Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.298666 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zf44m" event={"ID":"0c851d93-c82d-4d73-b456-47c8fa6e3f5d","Type":"ContainerDied","Data":"bbbfe5e1c99edf158d5b26a4e5af605ad0ebba3afc6be58bdfd011bb47f32f71"} Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.298725 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zf44m" event={"ID":"0c851d93-c82d-4d73-b456-47c8fa6e3f5d","Type":"ContainerStarted","Data":"4c58adc2e5b77a6204057f65839f7f4a9ad5a0bbac1380a769728daf824e331d"} Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.299999 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"3ee85241607de53040227807fb6c8c22ebbe3756779bc3d02214a0666b525596"} Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.300021 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"0ebfc4a500e9e7504300074d9bba043075561c078894e02ff294b430fd55e9e4"} Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.301059 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2f576461-ecf4-46ad-b7a0-ada0ee468ef0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2f576461-ecf4-46ad-b7a0-ada0ee468ef0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.301137 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f502088-4131-4a3a-ae5c-d4beff0db583-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8f502088-4131-4a3a-ae5c-d4beff0db583\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.301173 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2f576461-ecf4-46ad-b7a0-ada0ee468ef0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2f576461-ecf4-46ad-b7a0-ada0ee468ef0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.301234 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8f502088-4131-4a3a-ae5c-d4beff0db583-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8f502088-4131-4a3a-ae5c-d4beff0db583\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.301372 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/8f502088-4131-4a3a-ae5c-d4beff0db583-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8f502088-4131-4a3a-ae5c-d4beff0db583\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.327226 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f502088-4131-4a3a-ae5c-d4beff0db583-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8f502088-4131-4a3a-ae5c-d4beff0db583\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.400116 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.404245 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2f576461-ecf4-46ad-b7a0-ada0ee468ef0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2f576461-ecf4-46ad-b7a0-ada0ee468ef0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.404349 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2f576461-ecf4-46ad-b7a0-ada0ee468ef0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2f576461-ecf4-46ad-b7a0-ada0ee468ef0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.406544 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2f576461-ecf4-46ad-b7a0-ada0ee468ef0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2f576461-ecf4-46ad-b7a0-ada0ee468ef0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 21 00:10:15 crc 
kubenswrapper[4906]: I0221 00:10:15.427555 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2f576461-ecf4-46ad-b7a0-ada0ee468ef0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2f576461-ecf4-46ad-b7a0-ada0ee468ef0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.465923 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" podStartSLOduration=129.465903106 podStartE2EDuration="2m9.465903106s" podCreationTimestamp="2026-02-21 00:08:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:10:15.458401012 +0000 UTC m=+150.709988528" watchObservedRunningTime="2026-02-21 00:10:15.465903106 +0000 UTC m=+150.717490612" Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.470049 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.527533 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.692846 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.703408 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kf6pc" Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.800938 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 21 00:10:15 crc kubenswrapper[4906]: W0221 00:10:15.818566 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod2f576461_ecf4_46ad_b7a0_ada0ee468ef0.slice/crio-0a4856951474d23ed0dd8e758fc86fb20813937c47a560304caec8cc94188972 WatchSource:0}: Error finding container 0a4856951474d23ed0dd8e758fc86fb20813937c47a560304caec8cc94188972: Status 404 returned error can't find the container with id 0a4856951474d23ed0dd8e758fc86fb20813937c47a560304caec8cc94188972 Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.960039 4906 patch_prober.go:28] interesting pod/router-default-5444994796-bzr2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 21 00:10:15 crc kubenswrapper[4906]: [-]has-synced failed: reason withheld Feb 21 00:10:15 crc kubenswrapper[4906]: [+]process-running ok Feb 21 00:10:15 crc kubenswrapper[4906]: healthz check failed Feb 21 00:10:15 crc kubenswrapper[4906]: I0221 00:10:15.960350 4906 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bzr2z" podUID="ead04ba4-ffa8-4bf8-ae26-c9014dfda96f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 21 00:10:16 crc kubenswrapper[4906]: I0221 00:10:16.123013 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5mlvx"] Feb 21 00:10:16 crc kubenswrapper[4906]: I0221 00:10:16.124067 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5mlvx" Feb 21 00:10:16 crc kubenswrapper[4906]: I0221 00:10:16.129444 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 21 00:10:16 crc kubenswrapper[4906]: I0221 00:10:16.129569 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5mlvx"] Feb 21 00:10:16 crc kubenswrapper[4906]: I0221 00:10:16.218964 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpsnw\" (UniqueName: \"kubernetes.io/projected/06df1df6-9fbb-40ed-a746-f90f10d5286f-kube-api-access-bpsnw\") pod \"redhat-marketplace-5mlvx\" (UID: \"06df1df6-9fbb-40ed-a746-f90f10d5286f\") " pod="openshift-marketplace/redhat-marketplace-5mlvx" Feb 21 00:10:16 crc kubenswrapper[4906]: I0221 00:10:16.219601 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06df1df6-9fbb-40ed-a746-f90f10d5286f-utilities\") pod \"redhat-marketplace-5mlvx\" (UID: \"06df1df6-9fbb-40ed-a746-f90f10d5286f\") " pod="openshift-marketplace/redhat-marketplace-5mlvx" Feb 21 00:10:16 crc kubenswrapper[4906]: I0221 00:10:16.219641 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/06df1df6-9fbb-40ed-a746-f90f10d5286f-catalog-content\") pod \"redhat-marketplace-5mlvx\" (UID: \"06df1df6-9fbb-40ed-a746-f90f10d5286f\") " pod="openshift-marketplace/redhat-marketplace-5mlvx" Feb 21 00:10:16 crc kubenswrapper[4906]: I0221 00:10:16.231055 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-86b7j" Feb 21 00:10:16 crc kubenswrapper[4906]: I0221 00:10:16.232836 4906 patch_prober.go:28] interesting pod/console-f9d7485db-86b7j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.25:8443/health\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Feb 21 00:10:16 crc kubenswrapper[4906]: I0221 00:10:16.232873 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-86b7j" podUID="beecb71f-3791-44c8-bee4-83585ee82c14" containerName="console" probeResult="failure" output="Get \"https://10.217.0.25:8443/health\": dial tcp 10.217.0.25:8443: connect: connection refused" Feb 21 00:10:16 crc kubenswrapper[4906]: I0221 00:10:16.235939 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-86b7j" Feb 21 00:10:16 crc kubenswrapper[4906]: I0221 00:10:16.307445 4906 generic.go:334] "Generic (PLEG): container finished" podID="c69b5bb2-d2af-4aa6-a35e-9c96b3f78622" containerID="364f42a3ab65178a37891c318024ee97e337f0e7ce56e0b773056b9fca4dfe8a" exitCode=0 Feb 21 00:10:16 crc kubenswrapper[4906]: I0221 00:10:16.307508 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m8xm9" event={"ID":"c69b5bb2-d2af-4aa6-a35e-9c96b3f78622","Type":"ContainerDied","Data":"364f42a3ab65178a37891c318024ee97e337f0e7ce56e0b773056b9fca4dfe8a"} Feb 21 00:10:16 crc kubenswrapper[4906]: I0221 00:10:16.314468 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8f502088-4131-4a3a-ae5c-d4beff0db583","Type":"ContainerStarted","Data":"08249a07f184836b06649870b5c274d9494d54c3178ab0059b1bf941660f1af7"} Feb 21 00:10:16 crc kubenswrapper[4906]: I0221 00:10:16.314505 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8f502088-4131-4a3a-ae5c-d4beff0db583","Type":"ContainerStarted","Data":"3e481328b82fe6a1615a82590938e92a077f852023f4bfddfb74783e48f52616"} Feb 21 00:10:16 crc kubenswrapper[4906]: I0221 00:10:16.315929 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"2f576461-ecf4-46ad-b7a0-ada0ee468ef0","Type":"ContainerStarted","Data":"d7a8d416886443e6cfcbb485124009ea6d81106ed684deebf5c2f8373dfaaa9f"} Feb 21 00:10:16 crc kubenswrapper[4906]: I0221 00:10:16.315978 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"2f576461-ecf4-46ad-b7a0-ada0ee468ef0","Type":"ContainerStarted","Data":"0a4856951474d23ed0dd8e758fc86fb20813937c47a560304caec8cc94188972"} Feb 21 00:10:16 crc kubenswrapper[4906]: I0221 00:10:16.320400 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpsnw\" (UniqueName: \"kubernetes.io/projected/06df1df6-9fbb-40ed-a746-f90f10d5286f-kube-api-access-bpsnw\") pod \"redhat-marketplace-5mlvx\" (UID: \"06df1df6-9fbb-40ed-a746-f90f10d5286f\") " pod="openshift-marketplace/redhat-marketplace-5mlvx" Feb 21 00:10:16 crc kubenswrapper[4906]: I0221 00:10:16.320448 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06df1df6-9fbb-40ed-a746-f90f10d5286f-utilities\") pod \"redhat-marketplace-5mlvx\" (UID: \"06df1df6-9fbb-40ed-a746-f90f10d5286f\") " pod="openshift-marketplace/redhat-marketplace-5mlvx" Feb 21 00:10:16 
crc kubenswrapper[4906]: I0221 00:10:16.320470 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06df1df6-9fbb-40ed-a746-f90f10d5286f-catalog-content\") pod \"redhat-marketplace-5mlvx\" (UID: \"06df1df6-9fbb-40ed-a746-f90f10d5286f\") " pod="openshift-marketplace/redhat-marketplace-5mlvx" Feb 21 00:10:16 crc kubenswrapper[4906]: I0221 00:10:16.321879 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06df1df6-9fbb-40ed-a746-f90f10d5286f-catalog-content\") pod \"redhat-marketplace-5mlvx\" (UID: \"06df1df6-9fbb-40ed-a746-f90f10d5286f\") " pod="openshift-marketplace/redhat-marketplace-5mlvx" Feb 21 00:10:16 crc kubenswrapper[4906]: I0221 00:10:16.322002 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06df1df6-9fbb-40ed-a746-f90f10d5286f-utilities\") pod \"redhat-marketplace-5mlvx\" (UID: \"06df1df6-9fbb-40ed-a746-f90f10d5286f\") " pod="openshift-marketplace/redhat-marketplace-5mlvx" Feb 21 00:10:16 crc kubenswrapper[4906]: I0221 00:10:16.342398 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=1.3423811159999999 podStartE2EDuration="1.342381116s" podCreationTimestamp="2026-02-21 00:10:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:10:16.340485712 +0000 UTC m=+151.592073238" watchObservedRunningTime="2026-02-21 00:10:16.342381116 +0000 UTC m=+151.593968622" Feb 21 00:10:16 crc kubenswrapper[4906]: I0221 00:10:16.344359 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpsnw\" (UniqueName: \"kubernetes.io/projected/06df1df6-9fbb-40ed-a746-f90f10d5286f-kube-api-access-bpsnw\") pod 
\"redhat-marketplace-5mlvx\" (UID: \"06df1df6-9fbb-40ed-a746-f90f10d5286f\") " pod="openshift-marketplace/redhat-marketplace-5mlvx" Feb 21 00:10:16 crc kubenswrapper[4906]: I0221 00:10:16.361523 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=1.361501933 podStartE2EDuration="1.361501933s" podCreationTimestamp="2026-02-21 00:10:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:10:16.350806427 +0000 UTC m=+151.602393933" watchObservedRunningTime="2026-02-21 00:10:16.361501933 +0000 UTC m=+151.613089439" Feb 21 00:10:16 crc kubenswrapper[4906]: I0221 00:10:16.448995 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5mlvx" Feb 21 00:10:16 crc kubenswrapper[4906]: I0221 00:10:16.495800 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r9sr7" Feb 21 00:10:16 crc kubenswrapper[4906]: I0221 00:10:16.500825 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r9sr7" Feb 21 00:10:16 crc kubenswrapper[4906]: I0221 00:10:16.556446 4906 patch_prober.go:28] interesting pod/downloads-7954f5f757-84vm6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Feb 21 00:10:16 crc kubenswrapper[4906]: I0221 00:10:16.556515 4906 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-84vm6" podUID="4145877d-bf92-4ebc-8552-df3e4680eaf5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" Feb 21 
00:10:16 crc kubenswrapper[4906]: I0221 00:10:16.556588 4906 patch_prober.go:28] interesting pod/downloads-7954f5f757-84vm6 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Feb 21 00:10:16 crc kubenswrapper[4906]: I0221 00:10:16.556693 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-84vm6" podUID="4145877d-bf92-4ebc-8552-df3e4680eaf5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" Feb 21 00:10:16 crc kubenswrapper[4906]: I0221 00:10:16.573089 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gpk7r"] Feb 21 00:10:16 crc kubenswrapper[4906]: I0221 00:10:16.574714 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gpk7r" Feb 21 00:10:16 crc kubenswrapper[4906]: I0221 00:10:16.580077 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gpk7r"] Feb 21 00:10:16 crc kubenswrapper[4906]: I0221 00:10:16.685967 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-6578b" Feb 21 00:10:16 crc kubenswrapper[4906]: I0221 00:10:16.686011 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-6578b" Feb 21 00:10:16 crc kubenswrapper[4906]: I0221 00:10:16.693161 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-6578b" Feb 21 00:10:16 crc kubenswrapper[4906]: E0221 00:10:16.718358 4906 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-pod2f576461_ecf4_46ad_b7a0_ada0ee468ef0.slice/crio-d7a8d416886443e6cfcbb485124009ea6d81106ed684deebf5c2f8373dfaaa9f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-pod2f576461_ecf4_46ad_b7a0_ada0ee468ef0.slice/crio-conmon-d7a8d416886443e6cfcbb485124009ea6d81106ed684deebf5c2f8373dfaaa9f.scope\": RecentStats: unable to find data in memory cache]" Feb 21 00:10:16 crc kubenswrapper[4906]: I0221 00:10:16.729390 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92ef311b-1a77-4d80-be38-637a6aa7de11-catalog-content\") pod \"redhat-marketplace-gpk7r\" (UID: \"92ef311b-1a77-4d80-be38-637a6aa7de11\") " pod="openshift-marketplace/redhat-marketplace-gpk7r" Feb 21 00:10:16 crc kubenswrapper[4906]: I0221 00:10:16.729491 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfstm\" (UniqueName: \"kubernetes.io/projected/92ef311b-1a77-4d80-be38-637a6aa7de11-kube-api-access-vfstm\") pod \"redhat-marketplace-gpk7r\" (UID: \"92ef311b-1a77-4d80-be38-637a6aa7de11\") " pod="openshift-marketplace/redhat-marketplace-gpk7r" Feb 21 00:10:16 crc kubenswrapper[4906]: I0221 00:10:16.729547 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92ef311b-1a77-4d80-be38-637a6aa7de11-utilities\") pod \"redhat-marketplace-gpk7r\" (UID: \"92ef311b-1a77-4d80-be38-637a6aa7de11\") " pod="openshift-marketplace/redhat-marketplace-gpk7r" Feb 21 00:10:16 crc kubenswrapper[4906]: I0221 00:10:16.777675 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5mlvx"] Feb 21 00:10:16 crc kubenswrapper[4906]: W0221 00:10:16.785466 4906 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06df1df6_9fbb_40ed_a746_f90f10d5286f.slice/crio-d15f2d0777542ea02e91b3c6c3a21e171c7d6701a27fee9ceea650bcda6dba31 WatchSource:0}: Error finding container d15f2d0777542ea02e91b3c6c3a21e171c7d6701a27fee9ceea650bcda6dba31: Status 404 returned error can't find the container with id d15f2d0777542ea02e91b3c6c3a21e171c7d6701a27fee9ceea650bcda6dba31 Feb 21 00:10:16 crc kubenswrapper[4906]: I0221 00:10:16.832233 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92ef311b-1a77-4d80-be38-637a6aa7de11-catalog-content\") pod \"redhat-marketplace-gpk7r\" (UID: \"92ef311b-1a77-4d80-be38-637a6aa7de11\") " pod="openshift-marketplace/redhat-marketplace-gpk7r" Feb 21 00:10:16 crc kubenswrapper[4906]: I0221 00:10:16.833325 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92ef311b-1a77-4d80-be38-637a6aa7de11-catalog-content\") pod \"redhat-marketplace-gpk7r\" (UID: \"92ef311b-1a77-4d80-be38-637a6aa7de11\") " pod="openshift-marketplace/redhat-marketplace-gpk7r" Feb 21 00:10:16 crc kubenswrapper[4906]: I0221 00:10:16.833771 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfstm\" (UniqueName: \"kubernetes.io/projected/92ef311b-1a77-4d80-be38-637a6aa7de11-kube-api-access-vfstm\") pod \"redhat-marketplace-gpk7r\" (UID: \"92ef311b-1a77-4d80-be38-637a6aa7de11\") " pod="openshift-marketplace/redhat-marketplace-gpk7r" Feb 21 00:10:16 crc kubenswrapper[4906]: I0221 00:10:16.833817 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92ef311b-1a77-4d80-be38-637a6aa7de11-utilities\") pod \"redhat-marketplace-gpk7r\" (UID: \"92ef311b-1a77-4d80-be38-637a6aa7de11\") " pod="openshift-marketplace/redhat-marketplace-gpk7r" Feb 21 00:10:16 
crc kubenswrapper[4906]: I0221 00:10:16.835311 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92ef311b-1a77-4d80-be38-637a6aa7de11-utilities\") pod \"redhat-marketplace-gpk7r\" (UID: \"92ef311b-1a77-4d80-be38-637a6aa7de11\") " pod="openshift-marketplace/redhat-marketplace-gpk7r" Feb 21 00:10:16 crc kubenswrapper[4906]: I0221 00:10:16.852221 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfstm\" (UniqueName: \"kubernetes.io/projected/92ef311b-1a77-4d80-be38-637a6aa7de11-kube-api-access-vfstm\") pod \"redhat-marketplace-gpk7r\" (UID: \"92ef311b-1a77-4d80-be38-637a6aa7de11\") " pod="openshift-marketplace/redhat-marketplace-gpk7r" Feb 21 00:10:16 crc kubenswrapper[4906]: I0221 00:10:16.954178 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gpk7r" Feb 21 00:10:16 crc kubenswrapper[4906]: I0221 00:10:16.958140 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-bzr2z" Feb 21 00:10:16 crc kubenswrapper[4906]: I0221 00:10:16.960867 4906 patch_prober.go:28] interesting pod/router-default-5444994796-bzr2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 21 00:10:16 crc kubenswrapper[4906]: [-]has-synced failed: reason withheld Feb 21 00:10:16 crc kubenswrapper[4906]: [+]process-running ok Feb 21 00:10:16 crc kubenswrapper[4906]: healthz check failed Feb 21 00:10:16 crc kubenswrapper[4906]: I0221 00:10:16.960966 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bzr2z" podUID="ead04ba4-ffa8-4bf8-ae26-c9014dfda96f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 21 00:10:16 crc 
kubenswrapper[4906]: I0221 00:10:16.984427 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-rj6xk" Feb 21 00:10:17 crc kubenswrapper[4906]: I0221 00:10:17.125841 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bpdlr"] Feb 21 00:10:17 crc kubenswrapper[4906]: I0221 00:10:17.126919 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bpdlr" Feb 21 00:10:17 crc kubenswrapper[4906]: I0221 00:10:17.163677 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 21 00:10:17 crc kubenswrapper[4906]: I0221 00:10:17.166761 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bpdlr"] Feb 21 00:10:17 crc kubenswrapper[4906]: I0221 00:10:17.266653 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2cdeb59-eeb8-4641-921e-05a840fd96fa-catalog-content\") pod \"redhat-operators-bpdlr\" (UID: \"e2cdeb59-eeb8-4641-921e-05a840fd96fa\") " pod="openshift-marketplace/redhat-operators-bpdlr" Feb 21 00:10:17 crc kubenswrapper[4906]: I0221 00:10:17.266739 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2cdeb59-eeb8-4641-921e-05a840fd96fa-utilities\") pod \"redhat-operators-bpdlr\" (UID: \"e2cdeb59-eeb8-4641-921e-05a840fd96fa\") " pod="openshift-marketplace/redhat-operators-bpdlr" Feb 21 00:10:17 crc kubenswrapper[4906]: I0221 00:10:17.266803 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb2cm\" (UniqueName: \"kubernetes.io/projected/e2cdeb59-eeb8-4641-921e-05a840fd96fa-kube-api-access-fb2cm\") 
pod \"redhat-operators-bpdlr\" (UID: \"e2cdeb59-eeb8-4641-921e-05a840fd96fa\") " pod="openshift-marketplace/redhat-operators-bpdlr" Feb 21 00:10:17 crc kubenswrapper[4906]: I0221 00:10:17.325478 4906 generic.go:334] "Generic (PLEG): container finished" podID="06df1df6-9fbb-40ed-a746-f90f10d5286f" containerID="9a86890cabd0382954ee315ea5004a7295d3613ec572aae2e8908ce2e1f1741b" exitCode=0 Feb 21 00:10:17 crc kubenswrapper[4906]: I0221 00:10:17.325584 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5mlvx" event={"ID":"06df1df6-9fbb-40ed-a746-f90f10d5286f","Type":"ContainerDied","Data":"9a86890cabd0382954ee315ea5004a7295d3613ec572aae2e8908ce2e1f1741b"} Feb 21 00:10:17 crc kubenswrapper[4906]: I0221 00:10:17.325610 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5mlvx" event={"ID":"06df1df6-9fbb-40ed-a746-f90f10d5286f","Type":"ContainerStarted","Data":"d15f2d0777542ea02e91b3c6c3a21e171c7d6701a27fee9ceea650bcda6dba31"} Feb 21 00:10:17 crc kubenswrapper[4906]: I0221 00:10:17.327863 4906 generic.go:334] "Generic (PLEG): container finished" podID="2f576461-ecf4-46ad-b7a0-ada0ee468ef0" containerID="d7a8d416886443e6cfcbb485124009ea6d81106ed684deebf5c2f8373dfaaa9f" exitCode=0 Feb 21 00:10:17 crc kubenswrapper[4906]: I0221 00:10:17.327910 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"2f576461-ecf4-46ad-b7a0-ada0ee468ef0","Type":"ContainerDied","Data":"d7a8d416886443e6cfcbb485124009ea6d81106ed684deebf5c2f8373dfaaa9f"} Feb 21 00:10:17 crc kubenswrapper[4906]: I0221 00:10:17.328817 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gpk7r"] Feb 21 00:10:17 crc kubenswrapper[4906]: I0221 00:10:17.332283 4906 generic.go:334] "Generic (PLEG): container finished" podID="8f502088-4131-4a3a-ae5c-d4beff0db583" 
containerID="08249a07f184836b06649870b5c274d9494d54c3178ab0059b1bf941660f1af7" exitCode=0 Feb 21 00:10:17 crc kubenswrapper[4906]: I0221 00:10:17.334276 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8f502088-4131-4a3a-ae5c-d4beff0db583","Type":"ContainerDied","Data":"08249a07f184836b06649870b5c274d9494d54c3178ab0059b1bf941660f1af7"} Feb 21 00:10:17 crc kubenswrapper[4906]: I0221 00:10:17.338375 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-6578b" Feb 21 00:10:17 crc kubenswrapper[4906]: I0221 00:10:17.369086 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb2cm\" (UniqueName: \"kubernetes.io/projected/e2cdeb59-eeb8-4641-921e-05a840fd96fa-kube-api-access-fb2cm\") pod \"redhat-operators-bpdlr\" (UID: \"e2cdeb59-eeb8-4641-921e-05a840fd96fa\") " pod="openshift-marketplace/redhat-operators-bpdlr" Feb 21 00:10:17 crc kubenswrapper[4906]: I0221 00:10:17.369144 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2cdeb59-eeb8-4641-921e-05a840fd96fa-catalog-content\") pod \"redhat-operators-bpdlr\" (UID: \"e2cdeb59-eeb8-4641-921e-05a840fd96fa\") " pod="openshift-marketplace/redhat-operators-bpdlr" Feb 21 00:10:17 crc kubenswrapper[4906]: I0221 00:10:17.369175 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2cdeb59-eeb8-4641-921e-05a840fd96fa-utilities\") pod \"redhat-operators-bpdlr\" (UID: \"e2cdeb59-eeb8-4641-921e-05a840fd96fa\") " pod="openshift-marketplace/redhat-operators-bpdlr" Feb 21 00:10:17 crc kubenswrapper[4906]: I0221 00:10:17.369646 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e2cdeb59-eeb8-4641-921e-05a840fd96fa-utilities\") pod \"redhat-operators-bpdlr\" (UID: \"e2cdeb59-eeb8-4641-921e-05a840fd96fa\") " pod="openshift-marketplace/redhat-operators-bpdlr" Feb 21 00:10:17 crc kubenswrapper[4906]: I0221 00:10:17.370050 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2cdeb59-eeb8-4641-921e-05a840fd96fa-catalog-content\") pod \"redhat-operators-bpdlr\" (UID: \"e2cdeb59-eeb8-4641-921e-05a840fd96fa\") " pod="openshift-marketplace/redhat-operators-bpdlr" Feb 21 00:10:17 crc kubenswrapper[4906]: I0221 00:10:17.416645 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb2cm\" (UniqueName: \"kubernetes.io/projected/e2cdeb59-eeb8-4641-921e-05a840fd96fa-kube-api-access-fb2cm\") pod \"redhat-operators-bpdlr\" (UID: \"e2cdeb59-eeb8-4641-921e-05a840fd96fa\") " pod="openshift-marketplace/redhat-operators-bpdlr" Feb 21 00:10:17 crc kubenswrapper[4906]: I0221 00:10:17.484559 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bpdlr" Feb 21 00:10:17 crc kubenswrapper[4906]: I0221 00:10:17.561345 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bhglw"] Feb 21 00:10:17 crc kubenswrapper[4906]: I0221 00:10:17.562764 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bhglw" Feb 21 00:10:17 crc kubenswrapper[4906]: I0221 00:10:17.563572 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bhglw"] Feb 21 00:10:17 crc kubenswrapper[4906]: I0221 00:10:17.677279 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efb5d164-4d17-4181-b6a8-765003ec5f57-utilities\") pod \"redhat-operators-bhglw\" (UID: \"efb5d164-4d17-4181-b6a8-765003ec5f57\") " pod="openshift-marketplace/redhat-operators-bhglw" Feb 21 00:10:17 crc kubenswrapper[4906]: I0221 00:10:17.677334 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6254f\" (UniqueName: \"kubernetes.io/projected/efb5d164-4d17-4181-b6a8-765003ec5f57-kube-api-access-6254f\") pod \"redhat-operators-bhglw\" (UID: \"efb5d164-4d17-4181-b6a8-765003ec5f57\") " pod="openshift-marketplace/redhat-operators-bhglw" Feb 21 00:10:17 crc kubenswrapper[4906]: I0221 00:10:17.677401 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efb5d164-4d17-4181-b6a8-765003ec5f57-catalog-content\") pod \"redhat-operators-bhglw\" (UID: \"efb5d164-4d17-4181-b6a8-765003ec5f57\") " pod="openshift-marketplace/redhat-operators-bhglw" Feb 21 00:10:17 crc kubenswrapper[4906]: I0221 00:10:17.779378 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efb5d164-4d17-4181-b6a8-765003ec5f57-utilities\") pod \"redhat-operators-bhglw\" (UID: \"efb5d164-4d17-4181-b6a8-765003ec5f57\") " pod="openshift-marketplace/redhat-operators-bhglw" Feb 21 00:10:17 crc kubenswrapper[4906]: I0221 00:10:17.779438 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-6254f\" (UniqueName: \"kubernetes.io/projected/efb5d164-4d17-4181-b6a8-765003ec5f57-kube-api-access-6254f\") pod \"redhat-operators-bhglw\" (UID: \"efb5d164-4d17-4181-b6a8-765003ec5f57\") " pod="openshift-marketplace/redhat-operators-bhglw" Feb 21 00:10:17 crc kubenswrapper[4906]: I0221 00:10:17.779508 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efb5d164-4d17-4181-b6a8-765003ec5f57-catalog-content\") pod \"redhat-operators-bhglw\" (UID: \"efb5d164-4d17-4181-b6a8-765003ec5f57\") " pod="openshift-marketplace/redhat-operators-bhglw" Feb 21 00:10:17 crc kubenswrapper[4906]: I0221 00:10:17.780112 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efb5d164-4d17-4181-b6a8-765003ec5f57-catalog-content\") pod \"redhat-operators-bhglw\" (UID: \"efb5d164-4d17-4181-b6a8-765003ec5f57\") " pod="openshift-marketplace/redhat-operators-bhglw" Feb 21 00:10:17 crc kubenswrapper[4906]: I0221 00:10:17.780395 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efb5d164-4d17-4181-b6a8-765003ec5f57-utilities\") pod \"redhat-operators-bhglw\" (UID: \"efb5d164-4d17-4181-b6a8-765003ec5f57\") " pod="openshift-marketplace/redhat-operators-bhglw" Feb 21 00:10:17 crc kubenswrapper[4906]: I0221 00:10:17.816460 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6254f\" (UniqueName: \"kubernetes.io/projected/efb5d164-4d17-4181-b6a8-765003ec5f57-kube-api-access-6254f\") pod \"redhat-operators-bhglw\" (UID: \"efb5d164-4d17-4181-b6a8-765003ec5f57\") " pod="openshift-marketplace/redhat-operators-bhglw" Feb 21 00:10:17 crc kubenswrapper[4906]: I0221 00:10:17.883742 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bhglw" Feb 21 00:10:17 crc kubenswrapper[4906]: I0221 00:10:17.926569 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bpdlr"] Feb 21 00:10:17 crc kubenswrapper[4906]: W0221 00:10:17.951872 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2cdeb59_eeb8_4641_921e_05a840fd96fa.slice/crio-f1d09f215e5e4cc771d74cc35b06f8bfd5d741c1fcf9f2ce3f2492494d028bd1 WatchSource:0}: Error finding container f1d09f215e5e4cc771d74cc35b06f8bfd5d741c1fcf9f2ce3f2492494d028bd1: Status 404 returned error can't find the container with id f1d09f215e5e4cc771d74cc35b06f8bfd5d741c1fcf9f2ce3f2492494d028bd1 Feb 21 00:10:17 crc kubenswrapper[4906]: I0221 00:10:17.961340 4906 patch_prober.go:28] interesting pod/router-default-5444994796-bzr2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 21 00:10:17 crc kubenswrapper[4906]: [-]has-synced failed: reason withheld Feb 21 00:10:17 crc kubenswrapper[4906]: [+]process-running ok Feb 21 00:10:17 crc kubenswrapper[4906]: healthz check failed Feb 21 00:10:17 crc kubenswrapper[4906]: I0221 00:10:17.961408 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bzr2z" podUID="ead04ba4-ffa8-4bf8-ae26-c9014dfda96f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 21 00:10:18 crc kubenswrapper[4906]: I0221 00:10:18.347344 4906 generic.go:334] "Generic (PLEG): container finished" podID="e2cdeb59-eeb8-4641-921e-05a840fd96fa" containerID="34f7392d91469153a0a564d2ca3729fcb3e96e04bcd10881fed0721d8a57a73f" exitCode=0 Feb 21 00:10:18 crc kubenswrapper[4906]: I0221 00:10:18.347500 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-bpdlr" event={"ID":"e2cdeb59-eeb8-4641-921e-05a840fd96fa","Type":"ContainerDied","Data":"34f7392d91469153a0a564d2ca3729fcb3e96e04bcd10881fed0721d8a57a73f"} Feb 21 00:10:18 crc kubenswrapper[4906]: I0221 00:10:18.347802 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bpdlr" event={"ID":"e2cdeb59-eeb8-4641-921e-05a840fd96fa","Type":"ContainerStarted","Data":"f1d09f215e5e4cc771d74cc35b06f8bfd5d741c1fcf9f2ce3f2492494d028bd1"} Feb 21 00:10:18 crc kubenswrapper[4906]: I0221 00:10:18.350808 4906 generic.go:334] "Generic (PLEG): container finished" podID="92ef311b-1a77-4d80-be38-637a6aa7de11" containerID="28bc1c0d1e8952af5d188c26e1a40de18d4d88c1ea53407d15c2dd0c4f93dbad" exitCode=0 Feb 21 00:10:18 crc kubenswrapper[4906]: I0221 00:10:18.351009 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gpk7r" event={"ID":"92ef311b-1a77-4d80-be38-637a6aa7de11","Type":"ContainerDied","Data":"28bc1c0d1e8952af5d188c26e1a40de18d4d88c1ea53407d15c2dd0c4f93dbad"} Feb 21 00:10:18 crc kubenswrapper[4906]: I0221 00:10:18.351036 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gpk7r" event={"ID":"92ef311b-1a77-4d80-be38-637a6aa7de11","Type":"ContainerStarted","Data":"f4e8c6590bb14669f8fb31fc1a7c6927be9861bed902e62e522e3f61bc03e342"} Feb 21 00:10:18 crc kubenswrapper[4906]: I0221 00:10:18.427662 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bhglw"] Feb 21 00:10:18 crc kubenswrapper[4906]: I0221 00:10:18.629439 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 21 00:10:18 crc kubenswrapper[4906]: I0221 00:10:18.714802 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 21 00:10:18 crc kubenswrapper[4906]: I0221 00:10:18.800602 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2f576461-ecf4-46ad-b7a0-ada0ee468ef0-kubelet-dir\") pod \"2f576461-ecf4-46ad-b7a0-ada0ee468ef0\" (UID: \"2f576461-ecf4-46ad-b7a0-ada0ee468ef0\") " Feb 21 00:10:18 crc kubenswrapper[4906]: I0221 00:10:18.800758 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2f576461-ecf4-46ad-b7a0-ada0ee468ef0-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "2f576461-ecf4-46ad-b7a0-ada0ee468ef0" (UID: "2f576461-ecf4-46ad-b7a0-ada0ee468ef0"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 00:10:18 crc kubenswrapper[4906]: I0221 00:10:18.800679 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8f502088-4131-4a3a-ae5c-d4beff0db583-kubelet-dir\") pod \"8f502088-4131-4a3a-ae5c-d4beff0db583\" (UID: \"8f502088-4131-4a3a-ae5c-d4beff0db583\") " Feb 21 00:10:18 crc kubenswrapper[4906]: I0221 00:10:18.800837 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2f576461-ecf4-46ad-b7a0-ada0ee468ef0-kube-api-access\") pod \"2f576461-ecf4-46ad-b7a0-ada0ee468ef0\" (UID: \"2f576461-ecf4-46ad-b7a0-ada0ee468ef0\") " Feb 21 00:10:18 crc kubenswrapper[4906]: I0221 00:10:18.800892 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f502088-4131-4a3a-ae5c-d4beff0db583-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8f502088-4131-4a3a-ae5c-d4beff0db583" (UID: "8f502088-4131-4a3a-ae5c-d4beff0db583"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 00:10:18 crc kubenswrapper[4906]: I0221 00:10:18.800908 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f502088-4131-4a3a-ae5c-d4beff0db583-kube-api-access\") pod \"8f502088-4131-4a3a-ae5c-d4beff0db583\" (UID: \"8f502088-4131-4a3a-ae5c-d4beff0db583\") " Feb 21 00:10:18 crc kubenswrapper[4906]: I0221 00:10:18.801373 4906 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8f502088-4131-4a3a-ae5c-d4beff0db583-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 21 00:10:18 crc kubenswrapper[4906]: I0221 00:10:18.801393 4906 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2f576461-ecf4-46ad-b7a0-ada0ee468ef0-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 21 00:10:18 crc kubenswrapper[4906]: I0221 00:10:18.806329 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f502088-4131-4a3a-ae5c-d4beff0db583-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8f502088-4131-4a3a-ae5c-d4beff0db583" (UID: "8f502088-4131-4a3a-ae5c-d4beff0db583"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:10:18 crc kubenswrapper[4906]: I0221 00:10:18.806366 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f576461-ecf4-46ad-b7a0-ada0ee468ef0-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "2f576461-ecf4-46ad-b7a0-ada0ee468ef0" (UID: "2f576461-ecf4-46ad-b7a0-ada0ee468ef0"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:10:18 crc kubenswrapper[4906]: I0221 00:10:18.902251 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2f576461-ecf4-46ad-b7a0-ada0ee468ef0-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 21 00:10:18 crc kubenswrapper[4906]: I0221 00:10:18.902288 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f502088-4131-4a3a-ae5c-d4beff0db583-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 21 00:10:18 crc kubenswrapper[4906]: I0221 00:10:18.962883 4906 patch_prober.go:28] interesting pod/router-default-5444994796-bzr2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 21 00:10:18 crc kubenswrapper[4906]: [-]has-synced failed: reason withheld Feb 21 00:10:18 crc kubenswrapper[4906]: [+]process-running ok Feb 21 00:10:18 crc kubenswrapper[4906]: healthz check failed Feb 21 00:10:18 crc kubenswrapper[4906]: I0221 00:10:18.962964 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bzr2z" podUID="ead04ba4-ffa8-4bf8-ae26-c9014dfda96f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 21 00:10:19 crc kubenswrapper[4906]: I0221 00:10:19.361249 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8f502088-4131-4a3a-ae5c-d4beff0db583","Type":"ContainerDied","Data":"3e481328b82fe6a1615a82590938e92a077f852023f4bfddfb74783e48f52616"} Feb 21 00:10:19 crc kubenswrapper[4906]: I0221 00:10:19.361289 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 21 00:10:19 crc kubenswrapper[4906]: I0221 00:10:19.361298 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e481328b82fe6a1615a82590938e92a077f852023f4bfddfb74783e48f52616" Feb 21 00:10:19 crc kubenswrapper[4906]: I0221 00:10:19.379082 4906 generic.go:334] "Generic (PLEG): container finished" podID="efb5d164-4d17-4181-b6a8-765003ec5f57" containerID="8d9c2e8c54ffd7a6eb9a3995875d8522ba83ecd0a6f6c6ac9f2bf30eb6206ae0" exitCode=0 Feb 21 00:10:19 crc kubenswrapper[4906]: I0221 00:10:19.379217 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bhglw" event={"ID":"efb5d164-4d17-4181-b6a8-765003ec5f57","Type":"ContainerDied","Data":"8d9c2e8c54ffd7a6eb9a3995875d8522ba83ecd0a6f6c6ac9f2bf30eb6206ae0"} Feb 21 00:10:19 crc kubenswrapper[4906]: I0221 00:10:19.379255 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bhglw" event={"ID":"efb5d164-4d17-4181-b6a8-765003ec5f57","Type":"ContainerStarted","Data":"b6568df4504fb57b7a444febe063e64a4430e26eb074ebdb6efdae3a6a0a6e1a"} Feb 21 00:10:19 crc kubenswrapper[4906]: I0221 00:10:19.383474 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"2f576461-ecf4-46ad-b7a0-ada0ee468ef0","Type":"ContainerDied","Data":"0a4856951474d23ed0dd8e758fc86fb20813937c47a560304caec8cc94188972"} Feb 21 00:10:19 crc kubenswrapper[4906]: I0221 00:10:19.383507 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a4856951474d23ed0dd8e758fc86fb20813937c47a560304caec8cc94188972" Feb 21 00:10:19 crc kubenswrapper[4906]: I0221 00:10:19.383590 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 21 00:10:19 crc kubenswrapper[4906]: I0221 00:10:19.969005 4906 patch_prober.go:28] interesting pod/router-default-5444994796-bzr2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 21 00:10:19 crc kubenswrapper[4906]: [-]has-synced failed: reason withheld Feb 21 00:10:19 crc kubenswrapper[4906]: [+]process-running ok Feb 21 00:10:19 crc kubenswrapper[4906]: healthz check failed Feb 21 00:10:19 crc kubenswrapper[4906]: I0221 00:10:19.969058 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bzr2z" podUID="ead04ba4-ffa8-4bf8-ae26-c9014dfda96f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 21 00:10:20 crc kubenswrapper[4906]: I0221 00:10:20.963242 4906 patch_prober.go:28] interesting pod/router-default-5444994796-bzr2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 21 00:10:20 crc kubenswrapper[4906]: [-]has-synced failed: reason withheld Feb 21 00:10:20 crc kubenswrapper[4906]: [+]process-running ok Feb 21 00:10:20 crc kubenswrapper[4906]: healthz check failed Feb 21 00:10:20 crc kubenswrapper[4906]: I0221 00:10:20.963325 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bzr2z" podUID="ead04ba4-ffa8-4bf8-ae26-c9014dfda96f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 21 00:10:21 crc kubenswrapper[4906]: I0221 00:10:21.960715 4906 patch_prober.go:28] interesting pod/router-default-5444994796-bzr2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with 
statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 21 00:10:21 crc kubenswrapper[4906]: [-]has-synced failed: reason withheld Feb 21 00:10:21 crc kubenswrapper[4906]: [+]process-running ok Feb 21 00:10:21 crc kubenswrapper[4906]: healthz check failed Feb 21 00:10:21 crc kubenswrapper[4906]: I0221 00:10:21.960801 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bzr2z" podUID="ead04ba4-ffa8-4bf8-ae26-c9014dfda96f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 21 00:10:22 crc kubenswrapper[4906]: I0221 00:10:22.371659 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-gqmhh" Feb 21 00:10:22 crc kubenswrapper[4906]: I0221 00:10:22.961261 4906 patch_prober.go:28] interesting pod/router-default-5444994796-bzr2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 21 00:10:22 crc kubenswrapper[4906]: [-]has-synced failed: reason withheld Feb 21 00:10:22 crc kubenswrapper[4906]: [+]process-running ok Feb 21 00:10:22 crc kubenswrapper[4906]: healthz check failed Feb 21 00:10:22 crc kubenswrapper[4906]: I0221 00:10:22.961317 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bzr2z" podUID="ead04ba4-ffa8-4bf8-ae26-c9014dfda96f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 21 00:10:23 crc kubenswrapper[4906]: I0221 00:10:23.961059 4906 patch_prober.go:28] interesting pod/router-default-5444994796-bzr2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 21 00:10:23 crc kubenswrapper[4906]: [-]has-synced failed: reason withheld Feb 21 
00:10:23 crc kubenswrapper[4906]: [+]process-running ok Feb 21 00:10:23 crc kubenswrapper[4906]: healthz check failed Feb 21 00:10:23 crc kubenswrapper[4906]: I0221 00:10:23.961122 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bzr2z" podUID="ead04ba4-ffa8-4bf8-ae26-c9014dfda96f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 21 00:10:24 crc kubenswrapper[4906]: I0221 00:10:24.960189 4906 patch_prober.go:28] interesting pod/router-default-5444994796-bzr2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 21 00:10:24 crc kubenswrapper[4906]: [+]has-synced ok Feb 21 00:10:24 crc kubenswrapper[4906]: [+]process-running ok Feb 21 00:10:24 crc kubenswrapper[4906]: healthz check failed Feb 21 00:10:24 crc kubenswrapper[4906]: I0221 00:10:24.960555 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bzr2z" podUID="ead04ba4-ffa8-4bf8-ae26-c9014dfda96f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 21 00:10:25 crc kubenswrapper[4906]: I0221 00:10:25.961558 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-bzr2z" Feb 21 00:10:25 crc kubenswrapper[4906]: I0221 00:10:25.968168 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-bzr2z" Feb 21 00:10:26 crc kubenswrapper[4906]: I0221 00:10:26.230709 4906 patch_prober.go:28] interesting pod/console-f9d7485db-86b7j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.25:8443/health\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Feb 21 00:10:26 crc kubenswrapper[4906]: I0221 
00:10:26.231035 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-86b7j" podUID="beecb71f-3791-44c8-bee4-83585ee82c14" containerName="console" probeResult="failure" output="Get \"https://10.217.0.25:8443/health\": dial tcp 10.217.0.25:8443: connect: connection refused" Feb 21 00:10:26 crc kubenswrapper[4906]: I0221 00:10:26.562262 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-84vm6" Feb 21 00:10:29 crc kubenswrapper[4906]: I0221 00:10:29.784195 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7544a92e-993a-46af-9f26-243f53d1206d-metrics-certs\") pod \"network-metrics-daemon-rhw7p\" (UID: \"7544a92e-993a-46af-9f26-243f53d1206d\") " pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:10:29 crc kubenswrapper[4906]: I0221 00:10:29.789867 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7544a92e-993a-46af-9f26-243f53d1206d-metrics-certs\") pod \"network-metrics-daemon-rhw7p\" (UID: \"7544a92e-993a-46af-9f26-243f53d1206d\") " pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:10:30 crc kubenswrapper[4906]: I0221 00:10:30.087473 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhw7p" Feb 21 00:10:34 crc kubenswrapper[4906]: I0221 00:10:34.069335 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:10:36 crc kubenswrapper[4906]: I0221 00:10:36.235568 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-86b7j" Feb 21 00:10:36 crc kubenswrapper[4906]: I0221 00:10:36.239958 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-86b7j" Feb 21 00:10:43 crc kubenswrapper[4906]: I0221 00:10:43.123888 4906 patch_prober.go:28] interesting pod/machine-config-daemon-b9qdv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 00:10:43 crc kubenswrapper[4906]: I0221 00:10:43.124004 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 00:10:44 crc kubenswrapper[4906]: I0221 00:10:44.550952 4906 generic.go:334] "Generic (PLEG): container finished" podID="eb81731c-9eb7-4bf7-a263-a2117fabb5cc" containerID="86b63b2adfd59d33ed759387a0da097447cf23b057f3e85341c69aaeb752bf71" exitCode=0 Feb 21 00:10:44 crc kubenswrapper[4906]: I0221 00:10:44.551078 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29527200-5vtx4" event={"ID":"eb81731c-9eb7-4bf7-a263-a2117fabb5cc","Type":"ContainerDied","Data":"86b63b2adfd59d33ed759387a0da097447cf23b057f3e85341c69aaeb752bf71"} Feb 21 00:10:46 crc 
kubenswrapper[4906]: I0221 00:10:46.576631 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-667sz" Feb 21 00:10:47 crc kubenswrapper[4906]: E0221 00:10:47.585963 4906 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 21 00:10:47 crc kubenswrapper[4906]: E0221 00:10:47.586466 4906 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vfstm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFro
mSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-gpk7r_openshift-marketplace(92ef311b-1a77-4d80-be38-637a6aa7de11): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 21 00:10:47 crc kubenswrapper[4906]: E0221 00:10:47.588289 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-gpk7r" podUID="92ef311b-1a77-4d80-be38-637a6aa7de11" Feb 21 00:10:47 crc kubenswrapper[4906]: E0221 00:10:47.733096 4906 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 21 00:10:47 crc kubenswrapper[4906]: E0221 00:10:47.733264 4906 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bpsnw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-5mlvx_openshift-marketplace(06df1df6-9fbb-40ed-a746-f90f10d5286f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 21 00:10:47 crc kubenswrapper[4906]: E0221 00:10:47.734454 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-5mlvx" podUID="06df1df6-9fbb-40ed-a746-f90f10d5286f" Feb 21 00:10:47 crc 
kubenswrapper[4906]: E0221 00:10:47.755179 4906 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 21 00:10:47 crc kubenswrapper[4906]: E0221 00:10:47.755337 4906 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bcnsh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-zf44m_openshift-marketplace(0c851d93-c82d-4d73-b456-47c8fa6e3f5d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 21 00:10:47 crc kubenswrapper[4906]: E0221 00:10:47.756568 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-zf44m" podUID="0c851d93-c82d-4d73-b456-47c8fa6e3f5d" Feb 21 00:10:50 crc kubenswrapper[4906]: E0221 00:10:50.823364 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-zf44m" podUID="0c851d93-c82d-4d73-b456-47c8fa6e3f5d" Feb 21 00:10:50 crc kubenswrapper[4906]: E0221 00:10:50.823410 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-gpk7r" podUID="92ef311b-1a77-4d80-be38-637a6aa7de11" Feb 21 00:10:50 crc kubenswrapper[4906]: E0221 00:10:50.823440 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-5mlvx" podUID="06df1df6-9fbb-40ed-a746-f90f10d5286f" Feb 21 00:10:50 crc kubenswrapper[4906]: I0221 00:10:50.899713 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29527200-5vtx4" Feb 21 00:10:50 crc kubenswrapper[4906]: E0221 00:10:50.938400 4906 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 21 00:10:50 crc kubenswrapper[4906]: E0221 00:10:50.938657 4906 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q9flm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolic
y:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-ptwns_openshift-marketplace(2ac4beba-19e0-4278-9701-d830f33d6688): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 21 00:10:50 crc kubenswrapper[4906]: E0221 00:10:50.942521 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-ptwns" podUID="2ac4beba-19e0-4278-9701-d830f33d6688" Feb 21 00:10:50 crc kubenswrapper[4906]: E0221 00:10:50.984324 4906 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 21 00:10:50 crc kubenswrapper[4906]: E0221 00:10:50.984468 4906 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fb2cm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-bpdlr_openshift-marketplace(e2cdeb59-eeb8-4641-921e-05a840fd96fa): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 21 00:10:50 crc kubenswrapper[4906]: E0221 00:10:50.985633 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-bpdlr" podUID="e2cdeb59-eeb8-4641-921e-05a840fd96fa" Feb 21 00:10:51 crc 
kubenswrapper[4906]: I0221 00:10:51.083329 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jv8f9\" (UniqueName: \"kubernetes.io/projected/eb81731c-9eb7-4bf7-a263-a2117fabb5cc-kube-api-access-jv8f9\") pod \"eb81731c-9eb7-4bf7-a263-a2117fabb5cc\" (UID: \"eb81731c-9eb7-4bf7-a263-a2117fabb5cc\") " Feb 21 00:10:51 crc kubenswrapper[4906]: I0221 00:10:51.083405 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/eb81731c-9eb7-4bf7-a263-a2117fabb5cc-serviceca\") pod \"eb81731c-9eb7-4bf7-a263-a2117fabb5cc\" (UID: \"eb81731c-9eb7-4bf7-a263-a2117fabb5cc\") " Feb 21 00:10:51 crc kubenswrapper[4906]: I0221 00:10:51.084754 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb81731c-9eb7-4bf7-a263-a2117fabb5cc-serviceca" (OuterVolumeSpecName: "serviceca") pod "eb81731c-9eb7-4bf7-a263-a2117fabb5cc" (UID: "eb81731c-9eb7-4bf7-a263-a2117fabb5cc"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:10:51 crc kubenswrapper[4906]: I0221 00:10:51.096491 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb81731c-9eb7-4bf7-a263-a2117fabb5cc-kube-api-access-jv8f9" (OuterVolumeSpecName: "kube-api-access-jv8f9") pod "eb81731c-9eb7-4bf7-a263-a2117fabb5cc" (UID: "eb81731c-9eb7-4bf7-a263-a2117fabb5cc"). InnerVolumeSpecName "kube-api-access-jv8f9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:10:51 crc kubenswrapper[4906]: I0221 00:10:51.184883 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jv8f9\" (UniqueName: \"kubernetes.io/projected/eb81731c-9eb7-4bf7-a263-a2117fabb5cc-kube-api-access-jv8f9\") on node \"crc\" DevicePath \"\"" Feb 21 00:10:51 crc kubenswrapper[4906]: I0221 00:10:51.185343 4906 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/eb81731c-9eb7-4bf7-a263-a2117fabb5cc-serviceca\") on node \"crc\" DevicePath \"\"" Feb 21 00:10:51 crc kubenswrapper[4906]: I0221 00:10:51.254842 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bhglw" event={"ID":"efb5d164-4d17-4181-b6a8-765003ec5f57","Type":"ContainerStarted","Data":"f70d91513363dab29d9aea95f466f5931a7fe5505457e3db3fc03a02cc3137ab"} Feb 21 00:10:51 crc kubenswrapper[4906]: I0221 00:10:51.257320 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gbrvn" event={"ID":"73edbe76-4cd4-45e2-903c-8329ddfa9f58","Type":"ContainerStarted","Data":"788aaebebcfe9de8be190f1557a46dffe191b83c9f4f2f14e363d181c8f09bbd"} Feb 21 00:10:51 crc kubenswrapper[4906]: I0221 00:10:51.260345 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29527200-5vtx4" event={"ID":"eb81731c-9eb7-4bf7-a263-a2117fabb5cc","Type":"ContainerDied","Data":"1806e6738408a8f1264606ff0f51915018a4f262d32476698e91ee71b4f4d35d"} Feb 21 00:10:51 crc kubenswrapper[4906]: I0221 00:10:51.260639 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1806e6738408a8f1264606ff0f51915018a4f262d32476698e91ee71b4f4d35d" Feb 21 00:10:51 crc kubenswrapper[4906]: I0221 00:10:51.260409 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29527200-5vtx4" Feb 21 00:10:51 crc kubenswrapper[4906]: I0221 00:10:51.262368 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m8xm9" event={"ID":"c69b5bb2-d2af-4aa6-a35e-9c96b3f78622","Type":"ContainerStarted","Data":"7bddb1006ea71471e455bc38a151b1a8cc0a40bb8eacae4855972f4de91525b7"} Feb 21 00:10:51 crc kubenswrapper[4906]: E0221 00:10:51.265107 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-bpdlr" podUID="e2cdeb59-eeb8-4641-921e-05a840fd96fa" Feb 21 00:10:51 crc kubenswrapper[4906]: E0221 00:10:51.266008 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-ptwns" podUID="2ac4beba-19e0-4278-9701-d830f33d6688" Feb 21 00:10:51 crc kubenswrapper[4906]: I0221 00:10:51.309761 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rhw7p"] Feb 21 00:10:52 crc kubenswrapper[4906]: I0221 00:10:52.268531 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rhw7p" event={"ID":"7544a92e-993a-46af-9f26-243f53d1206d","Type":"ContainerStarted","Data":"cacd6becabd8b1db2dc591fa524926ea41eb24a75e90add4695c598cc65290f7"} Feb 21 00:10:52 crc kubenswrapper[4906]: I0221 00:10:52.269795 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rhw7p" 
event={"ID":"7544a92e-993a-46af-9f26-243f53d1206d","Type":"ContainerStarted","Data":"6fd5a2fe730da479434cbf6ba7565e9e7f00096dd755828964a6cc4d1bd0a19d"} Feb 21 00:10:52 crc kubenswrapper[4906]: I0221 00:10:52.269890 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rhw7p" event={"ID":"7544a92e-993a-46af-9f26-243f53d1206d","Type":"ContainerStarted","Data":"3626e7fe8d228d8384959aeb308433de6f2b8ec84d9249e3f354b5998a0c63e1"} Feb 21 00:10:52 crc kubenswrapper[4906]: I0221 00:10:52.274175 4906 generic.go:334] "Generic (PLEG): container finished" podID="efb5d164-4d17-4181-b6a8-765003ec5f57" containerID="f70d91513363dab29d9aea95f466f5931a7fe5505457e3db3fc03a02cc3137ab" exitCode=0 Feb 21 00:10:52 crc kubenswrapper[4906]: I0221 00:10:52.274322 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bhglw" event={"ID":"efb5d164-4d17-4181-b6a8-765003ec5f57","Type":"ContainerDied","Data":"f70d91513363dab29d9aea95f466f5931a7fe5505457e3db3fc03a02cc3137ab"} Feb 21 00:10:52 crc kubenswrapper[4906]: I0221 00:10:52.278402 4906 generic.go:334] "Generic (PLEG): container finished" podID="73edbe76-4cd4-45e2-903c-8329ddfa9f58" containerID="788aaebebcfe9de8be190f1557a46dffe191b83c9f4f2f14e363d181c8f09bbd" exitCode=0 Feb 21 00:10:52 crc kubenswrapper[4906]: I0221 00:10:52.278488 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gbrvn" event={"ID":"73edbe76-4cd4-45e2-903c-8329ddfa9f58","Type":"ContainerDied","Data":"788aaebebcfe9de8be190f1557a46dffe191b83c9f4f2f14e363d181c8f09bbd"} Feb 21 00:10:52 crc kubenswrapper[4906]: I0221 00:10:52.285793 4906 generic.go:334] "Generic (PLEG): container finished" podID="c69b5bb2-d2af-4aa6-a35e-9c96b3f78622" containerID="7bddb1006ea71471e455bc38a151b1a8cc0a40bb8eacae4855972f4de91525b7" exitCode=0 Feb 21 00:10:52 crc kubenswrapper[4906]: I0221 00:10:52.285884 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-m8xm9" event={"ID":"c69b5bb2-d2af-4aa6-a35e-9c96b3f78622","Type":"ContainerDied","Data":"7bddb1006ea71471e455bc38a151b1a8cc0a40bb8eacae4855972f4de91525b7"} Feb 21 00:10:52 crc kubenswrapper[4906]: I0221 00:10:52.295676 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-rhw7p" podStartSLOduration=167.295648405 podStartE2EDuration="2m47.295648405s" podCreationTimestamp="2026-02-21 00:08:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:10:52.29514205 +0000 UTC m=+187.546729576" watchObservedRunningTime="2026-02-21 00:10:52.295648405 +0000 UTC m=+187.547235911" Feb 21 00:10:53 crc kubenswrapper[4906]: I0221 00:10:53.295046 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gbrvn" event={"ID":"73edbe76-4cd4-45e2-903c-8329ddfa9f58","Type":"ContainerStarted","Data":"2c9d5d6dafa68f34552253beedd93b0f829fe32732da3746a736fa19f2314f8f"} Feb 21 00:10:53 crc kubenswrapper[4906]: I0221 00:10:53.297615 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bhglw" event={"ID":"efb5d164-4d17-4181-b6a8-765003ec5f57","Type":"ContainerStarted","Data":"0495a42ba8262a96e2320e92b87885f5113e6de9b26dad20f4e8a917fcd21bda"} Feb 21 00:10:53 crc kubenswrapper[4906]: I0221 00:10:53.313304 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gbrvn" podStartSLOduration=1.5930107310000001 podStartE2EDuration="39.313283662s" podCreationTimestamp="2026-02-21 00:10:14 +0000 UTC" firstStartedPulling="2026-02-21 00:10:15.286138447 +0000 UTC m=+150.537725953" lastFinishedPulling="2026-02-21 00:10:53.006411368 +0000 UTC m=+188.257998884" observedRunningTime="2026-02-21 00:10:53.310841802 +0000 UTC m=+188.562429338" 
watchObservedRunningTime="2026-02-21 00:10:53.313283662 +0000 UTC m=+188.564871208" Feb 21 00:10:53 crc kubenswrapper[4906]: I0221 00:10:53.334508 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bhglw" podStartSLOduration=2.6320869350000002 podStartE2EDuration="36.334487948s" podCreationTimestamp="2026-02-21 00:10:17 +0000 UTC" firstStartedPulling="2026-02-21 00:10:19.383744424 +0000 UTC m=+154.635331930" lastFinishedPulling="2026-02-21 00:10:53.086145437 +0000 UTC m=+188.337732943" observedRunningTime="2026-02-21 00:10:53.329516516 +0000 UTC m=+188.581104042" watchObservedRunningTime="2026-02-21 00:10:53.334487948 +0000 UTC m=+188.586075454" Feb 21 00:10:53 crc kubenswrapper[4906]: I0221 00:10:53.863802 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 00:10:54 crc kubenswrapper[4906]: I0221 00:10:54.305994 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m8xm9" event={"ID":"c69b5bb2-d2af-4aa6-a35e-9c96b3f78622","Type":"ContainerStarted","Data":"03d79fa6e580e48375be3589dab10bc8d56ad852fadd9aabf37d3c7b4d517677"} Feb 21 00:10:54 crc kubenswrapper[4906]: I0221 00:10:54.327945 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-m8xm9" podStartSLOduration=3.395053064 podStartE2EDuration="40.327923122s" podCreationTimestamp="2026-02-21 00:10:14 +0000 UTC" firstStartedPulling="2026-02-21 00:10:16.309188167 +0000 UTC m=+151.560775673" lastFinishedPulling="2026-02-21 00:10:53.242058225 +0000 UTC m=+188.493645731" observedRunningTime="2026-02-21 00:10:54.323716262 +0000 UTC m=+189.575303768" watchObservedRunningTime="2026-02-21 00:10:54.327923122 +0000 UTC m=+189.579510628" Feb 21 00:10:54 crc kubenswrapper[4906]: I0221 00:10:54.674614 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/community-operators-gbrvn" Feb 21 00:10:54 crc kubenswrapper[4906]: I0221 00:10:54.674664 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gbrvn" Feb 21 00:10:54 crc kubenswrapper[4906]: I0221 00:10:54.855466 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9kg2j"] Feb 21 00:10:54 crc kubenswrapper[4906]: I0221 00:10:54.872107 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-m8xm9" Feb 21 00:10:54 crc kubenswrapper[4906]: I0221 00:10:54.872193 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-m8xm9" Feb 21 00:10:55 crc kubenswrapper[4906]: I0221 00:10:55.848047 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 21 00:10:55 crc kubenswrapper[4906]: E0221 00:10:55.848877 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f576461-ecf4-46ad-b7a0-ada0ee468ef0" containerName="pruner" Feb 21 00:10:55 crc kubenswrapper[4906]: I0221 00:10:55.848897 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f576461-ecf4-46ad-b7a0-ada0ee468ef0" containerName="pruner" Feb 21 00:10:55 crc kubenswrapper[4906]: E0221 00:10:55.848907 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f502088-4131-4a3a-ae5c-d4beff0db583" containerName="pruner" Feb 21 00:10:55 crc kubenswrapper[4906]: I0221 00:10:55.848916 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f502088-4131-4a3a-ae5c-d4beff0db583" containerName="pruner" Feb 21 00:10:55 crc kubenswrapper[4906]: E0221 00:10:55.848929 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb81731c-9eb7-4bf7-a263-a2117fabb5cc" containerName="image-pruner" Feb 21 00:10:55 crc kubenswrapper[4906]: I0221 
00:10:55.848938 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb81731c-9eb7-4bf7-a263-a2117fabb5cc" containerName="image-pruner" Feb 21 00:10:55 crc kubenswrapper[4906]: I0221 00:10:55.849119 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f502088-4131-4a3a-ae5c-d4beff0db583" containerName="pruner" Feb 21 00:10:55 crc kubenswrapper[4906]: I0221 00:10:55.849131 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb81731c-9eb7-4bf7-a263-a2117fabb5cc" containerName="image-pruner" Feb 21 00:10:55 crc kubenswrapper[4906]: I0221 00:10:55.849148 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f576461-ecf4-46ad-b7a0-ada0ee468ef0" containerName="pruner" Feb 21 00:10:55 crc kubenswrapper[4906]: I0221 00:10:55.849633 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 21 00:10:55 crc kubenswrapper[4906]: I0221 00:10:55.851254 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-gbrvn" podUID="73edbe76-4cd4-45e2-903c-8329ddfa9f58" containerName="registry-server" probeResult="failure" output=< Feb 21 00:10:55 crc kubenswrapper[4906]: timeout: failed to connect service ":50051" within 1s Feb 21 00:10:55 crc kubenswrapper[4906]: > Feb 21 00:10:55 crc kubenswrapper[4906]: I0221 00:10:55.852607 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 21 00:10:55 crc kubenswrapper[4906]: I0221 00:10:55.852836 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 21 00:10:55 crc kubenswrapper[4906]: I0221 00:10:55.863593 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 21 00:10:55 crc kubenswrapper[4906]: I0221 00:10:55.919165 4906 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/certified-operators-m8xm9" podUID="c69b5bb2-d2af-4aa6-a35e-9c96b3f78622" containerName="registry-server" probeResult="failure" output=< Feb 21 00:10:55 crc kubenswrapper[4906]: timeout: failed to connect service ":50051" within 1s Feb 21 00:10:55 crc kubenswrapper[4906]: > Feb 21 00:10:56 crc kubenswrapper[4906]: I0221 00:10:56.001134 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e53150bf-3df1-4ec6-ad45-95b69c8ee444-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e53150bf-3df1-4ec6-ad45-95b69c8ee444\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 21 00:10:56 crc kubenswrapper[4906]: I0221 00:10:56.001195 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e53150bf-3df1-4ec6-ad45-95b69c8ee444-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e53150bf-3df1-4ec6-ad45-95b69c8ee444\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 21 00:10:56 crc kubenswrapper[4906]: I0221 00:10:56.102795 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e53150bf-3df1-4ec6-ad45-95b69c8ee444-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e53150bf-3df1-4ec6-ad45-95b69c8ee444\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 21 00:10:56 crc kubenswrapper[4906]: I0221 00:10:56.102865 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e53150bf-3df1-4ec6-ad45-95b69c8ee444-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e53150bf-3df1-4ec6-ad45-95b69c8ee444\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 21 00:10:56 crc kubenswrapper[4906]: I0221 00:10:56.103044 4906 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e53150bf-3df1-4ec6-ad45-95b69c8ee444-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e53150bf-3df1-4ec6-ad45-95b69c8ee444\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 21 00:10:56 crc kubenswrapper[4906]: I0221 00:10:56.127356 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e53150bf-3df1-4ec6-ad45-95b69c8ee444-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e53150bf-3df1-4ec6-ad45-95b69c8ee444\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 21 00:10:56 crc kubenswrapper[4906]: I0221 00:10:56.176959 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 21 00:10:56 crc kubenswrapper[4906]: I0221 00:10:56.586601 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 21 00:10:56 crc kubenswrapper[4906]: W0221 00:10:56.595761 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode53150bf_3df1_4ec6_ad45_95b69c8ee444.slice/crio-7cd7fe941ea65a8e878064b24173b77575b6b40b2ba195ad3c66f0f9040377f1 WatchSource:0}: Error finding container 7cd7fe941ea65a8e878064b24173b77575b6b40b2ba195ad3c66f0f9040377f1: Status 404 returned error can't find the container with id 7cd7fe941ea65a8e878064b24173b77575b6b40b2ba195ad3c66f0f9040377f1 Feb 21 00:10:57 crc kubenswrapper[4906]: I0221 00:10:57.322018 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"e53150bf-3df1-4ec6-ad45-95b69c8ee444","Type":"ContainerStarted","Data":"7cd7fe941ea65a8e878064b24173b77575b6b40b2ba195ad3c66f0f9040377f1"} Feb 21 00:10:57 crc kubenswrapper[4906]: I0221 00:10:57.884863 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-bhglw" Feb 21 00:10:57 crc kubenswrapper[4906]: I0221 00:10:57.884910 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bhglw" Feb 21 00:10:58 crc kubenswrapper[4906]: I0221 00:10:58.334935 4906 generic.go:334] "Generic (PLEG): container finished" podID="e53150bf-3df1-4ec6-ad45-95b69c8ee444" containerID="1c4c67f7979cdee2c7ce5cea8deec229acd87ec0ff6f92fb8ea31dddf89a52d5" exitCode=0 Feb 21 00:10:58 crc kubenswrapper[4906]: I0221 00:10:58.335249 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"e53150bf-3df1-4ec6-ad45-95b69c8ee444","Type":"ContainerDied","Data":"1c4c67f7979cdee2c7ce5cea8deec229acd87ec0ff6f92fb8ea31dddf89a52d5"} Feb 21 00:10:58 crc kubenswrapper[4906]: I0221 00:10:58.928409 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bhglw" podUID="efb5d164-4d17-4181-b6a8-765003ec5f57" containerName="registry-server" probeResult="failure" output=< Feb 21 00:10:58 crc kubenswrapper[4906]: timeout: failed to connect service ":50051" within 1s Feb 21 00:10:58 crc kubenswrapper[4906]: > Feb 21 00:10:59 crc kubenswrapper[4906]: I0221 00:10:59.581960 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 21 00:10:59 crc kubenswrapper[4906]: I0221 00:10:59.648050 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e53150bf-3df1-4ec6-ad45-95b69c8ee444-kube-api-access\") pod \"e53150bf-3df1-4ec6-ad45-95b69c8ee444\" (UID: \"e53150bf-3df1-4ec6-ad45-95b69c8ee444\") " Feb 21 00:10:59 crc kubenswrapper[4906]: I0221 00:10:59.648111 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e53150bf-3df1-4ec6-ad45-95b69c8ee444-kubelet-dir\") pod \"e53150bf-3df1-4ec6-ad45-95b69c8ee444\" (UID: \"e53150bf-3df1-4ec6-ad45-95b69c8ee444\") " Feb 21 00:10:59 crc kubenswrapper[4906]: I0221 00:10:59.648429 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e53150bf-3df1-4ec6-ad45-95b69c8ee444-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e53150bf-3df1-4ec6-ad45-95b69c8ee444" (UID: "e53150bf-3df1-4ec6-ad45-95b69c8ee444"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 00:10:59 crc kubenswrapper[4906]: I0221 00:10:59.653422 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e53150bf-3df1-4ec6-ad45-95b69c8ee444-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e53150bf-3df1-4ec6-ad45-95b69c8ee444" (UID: "e53150bf-3df1-4ec6-ad45-95b69c8ee444"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:10:59 crc kubenswrapper[4906]: I0221 00:10:59.749568 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e53150bf-3df1-4ec6-ad45-95b69c8ee444-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 21 00:10:59 crc kubenswrapper[4906]: I0221 00:10:59.749619 4906 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e53150bf-3df1-4ec6-ad45-95b69c8ee444-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 21 00:11:00 crc kubenswrapper[4906]: I0221 00:11:00.347990 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"e53150bf-3df1-4ec6-ad45-95b69c8ee444","Type":"ContainerDied","Data":"7cd7fe941ea65a8e878064b24173b77575b6b40b2ba195ad3c66f0f9040377f1"} Feb 21 00:11:00 crc kubenswrapper[4906]: I0221 00:11:00.348029 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7cd7fe941ea65a8e878064b24173b77575b6b40b2ba195ad3c66f0f9040377f1" Feb 21 00:11:00 crc kubenswrapper[4906]: I0221 00:11:00.348045 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 21 00:11:03 crc kubenswrapper[4906]: I0221 00:11:03.840484 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 21 00:11:03 crc kubenswrapper[4906]: E0221 00:11:03.841064 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e53150bf-3df1-4ec6-ad45-95b69c8ee444" containerName="pruner" Feb 21 00:11:03 crc kubenswrapper[4906]: I0221 00:11:03.841081 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="e53150bf-3df1-4ec6-ad45-95b69c8ee444" containerName="pruner" Feb 21 00:11:03 crc kubenswrapper[4906]: I0221 00:11:03.842479 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="e53150bf-3df1-4ec6-ad45-95b69c8ee444" containerName="pruner" Feb 21 00:11:03 crc kubenswrapper[4906]: I0221 00:11:03.843344 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 21 00:11:03 crc kubenswrapper[4906]: I0221 00:11:03.847119 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 21 00:11:03 crc kubenswrapper[4906]: I0221 00:11:03.847662 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 21 00:11:03 crc kubenswrapper[4906]: I0221 00:11:03.869638 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 21 00:11:03 crc kubenswrapper[4906]: I0221 00:11:03.903078 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/03720e31-8028-44b3-9bc7-54cb0474a821-kubelet-dir\") pod \"installer-9-crc\" (UID: \"03720e31-8028-44b3-9bc7-54cb0474a821\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 21 00:11:03 crc kubenswrapper[4906]: I0221 00:11:03.903136 4906 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/03720e31-8028-44b3-9bc7-54cb0474a821-var-lock\") pod \"installer-9-crc\" (UID: \"03720e31-8028-44b3-9bc7-54cb0474a821\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 21 00:11:03 crc kubenswrapper[4906]: I0221 00:11:03.903163 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/03720e31-8028-44b3-9bc7-54cb0474a821-kube-api-access\") pod \"installer-9-crc\" (UID: \"03720e31-8028-44b3-9bc7-54cb0474a821\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 21 00:11:04 crc kubenswrapper[4906]: I0221 00:11:04.004857 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/03720e31-8028-44b3-9bc7-54cb0474a821-kubelet-dir\") pod \"installer-9-crc\" (UID: \"03720e31-8028-44b3-9bc7-54cb0474a821\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 21 00:11:04 crc kubenswrapper[4906]: I0221 00:11:04.004916 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/03720e31-8028-44b3-9bc7-54cb0474a821-var-lock\") pod \"installer-9-crc\" (UID: \"03720e31-8028-44b3-9bc7-54cb0474a821\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 21 00:11:04 crc kubenswrapper[4906]: I0221 00:11:04.004948 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/03720e31-8028-44b3-9bc7-54cb0474a821-kube-api-access\") pod \"installer-9-crc\" (UID: \"03720e31-8028-44b3-9bc7-54cb0474a821\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 21 00:11:04 crc kubenswrapper[4906]: I0221 00:11:04.005049 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/03720e31-8028-44b3-9bc7-54cb0474a821-var-lock\") pod \"installer-9-crc\" (UID: \"03720e31-8028-44b3-9bc7-54cb0474a821\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 21 00:11:04 crc kubenswrapper[4906]: I0221 00:11:04.005049 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/03720e31-8028-44b3-9bc7-54cb0474a821-kubelet-dir\") pod \"installer-9-crc\" (UID: \"03720e31-8028-44b3-9bc7-54cb0474a821\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 21 00:11:04 crc kubenswrapper[4906]: I0221 00:11:04.023075 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/03720e31-8028-44b3-9bc7-54cb0474a821-kube-api-access\") pod \"installer-9-crc\" (UID: \"03720e31-8028-44b3-9bc7-54cb0474a821\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 21 00:11:04 crc kubenswrapper[4906]: I0221 00:11:04.191724 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 21 00:11:04 crc kubenswrapper[4906]: I0221 00:11:04.573795 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 21 00:11:04 crc kubenswrapper[4906]: W0221 00:11:04.591780 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod03720e31_8028_44b3_9bc7_54cb0474a821.slice/crio-a62d0e6f2fe718e4b46b29a6528da5ce2d812c383085b4f94c4959a6fe7a7e7f WatchSource:0}: Error finding container a62d0e6f2fe718e4b46b29a6528da5ce2d812c383085b4f94c4959a6fe7a7e7f: Status 404 returned error can't find the container with id a62d0e6f2fe718e4b46b29a6528da5ce2d812c383085b4f94c4959a6fe7a7e7f Feb 21 00:11:04 crc kubenswrapper[4906]: I0221 00:11:04.729882 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gbrvn" Feb 21 00:11:04 crc kubenswrapper[4906]: I0221 00:11:04.790412 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gbrvn" Feb 21 00:11:04 crc kubenswrapper[4906]: I0221 00:11:04.913307 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-m8xm9" Feb 21 00:11:04 crc kubenswrapper[4906]: I0221 00:11:04.959974 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gbrvn"] Feb 21 00:11:04 crc kubenswrapper[4906]: I0221 00:11:04.963096 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-m8xm9" Feb 21 00:11:05 crc kubenswrapper[4906]: I0221 00:11:05.378975 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"03720e31-8028-44b3-9bc7-54cb0474a821","Type":"ContainerStarted","Data":"70db0738c93dba9222aeb95ece51e5d21a3869ce63dead82a6cba5dad27d36d1"} Feb 
21 00:11:05 crc kubenswrapper[4906]: I0221 00:11:05.379241 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"03720e31-8028-44b3-9bc7-54cb0474a821","Type":"ContainerStarted","Data":"a62d0e6f2fe718e4b46b29a6528da5ce2d812c383085b4f94c4959a6fe7a7e7f"} Feb 21 00:11:06 crc kubenswrapper[4906]: I0221 00:11:06.383561 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gbrvn" podUID="73edbe76-4cd4-45e2-903c-8329ddfa9f58" containerName="registry-server" containerID="cri-o://2c9d5d6dafa68f34552253beedd93b0f829fe32732da3746a736fa19f2314f8f" gracePeriod=2 Feb 21 00:11:06 crc kubenswrapper[4906]: I0221 00:11:06.401333 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=3.401315366 podStartE2EDuration="3.401315366s" podCreationTimestamp="2026-02-21 00:11:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:11:06.399468263 +0000 UTC m=+201.651055769" watchObservedRunningTime="2026-02-21 00:11:06.401315366 +0000 UTC m=+201.652902872" Feb 21 00:11:07 crc kubenswrapper[4906]: I0221 00:11:07.159528 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m8xm9"] Feb 21 00:11:07 crc kubenswrapper[4906]: I0221 00:11:07.159769 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-m8xm9" podUID="c69b5bb2-d2af-4aa6-a35e-9c96b3f78622" containerName="registry-server" containerID="cri-o://03d79fa6e580e48375be3589dab10bc8d56ad852fadd9aabf37d3c7b4d517677" gracePeriod=2 Feb 21 00:11:07 crc kubenswrapper[4906]: E0221 00:11:07.381036 4906 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc69b5bb2_d2af_4aa6_a35e_9c96b3f78622.slice/crio-03d79fa6e580e48375be3589dab10bc8d56ad852fadd9aabf37d3c7b4d517677.scope\": RecentStats: unable to find data in memory cache]" Feb 21 00:11:07 crc kubenswrapper[4906]: I0221 00:11:07.391599 4906 generic.go:334] "Generic (PLEG): container finished" podID="73edbe76-4cd4-45e2-903c-8329ddfa9f58" containerID="2c9d5d6dafa68f34552253beedd93b0f829fe32732da3746a736fa19f2314f8f" exitCode=0 Feb 21 00:11:07 crc kubenswrapper[4906]: I0221 00:11:07.391645 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gbrvn" event={"ID":"73edbe76-4cd4-45e2-903c-8329ddfa9f58","Type":"ContainerDied","Data":"2c9d5d6dafa68f34552253beedd93b0f829fe32732da3746a736fa19f2314f8f"} Feb 21 00:11:07 crc kubenswrapper[4906]: I0221 00:11:07.926286 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bhglw" Feb 21 00:11:07 crc kubenswrapper[4906]: I0221 00:11:07.970092 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bhglw" Feb 21 00:11:08 crc kubenswrapper[4906]: I0221 00:11:08.403427 4906 generic.go:334] "Generic (PLEG): container finished" podID="c69b5bb2-d2af-4aa6-a35e-9c96b3f78622" containerID="03d79fa6e580e48375be3589dab10bc8d56ad852fadd9aabf37d3c7b4d517677" exitCode=0 Feb 21 00:11:08 crc kubenswrapper[4906]: I0221 00:11:08.403510 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m8xm9" event={"ID":"c69b5bb2-d2af-4aa6-a35e-9c96b3f78622","Type":"ContainerDied","Data":"03d79fa6e580e48375be3589dab10bc8d56ad852fadd9aabf37d3c7b4d517677"} Feb 21 00:11:08 crc kubenswrapper[4906]: I0221 00:11:08.406738 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bpdlr" 
event={"ID":"e2cdeb59-eeb8-4641-921e-05a840fd96fa","Type":"ContainerStarted","Data":"44f8ac25733dd0e112cf8018d266ce62b77b4a5ab875a018e9a5ddf2ec81c413"} Feb 21 00:11:09 crc kubenswrapper[4906]: I0221 00:11:09.412669 4906 generic.go:334] "Generic (PLEG): container finished" podID="e2cdeb59-eeb8-4641-921e-05a840fd96fa" containerID="44f8ac25733dd0e112cf8018d266ce62b77b4a5ab875a018e9a5ddf2ec81c413" exitCode=0 Feb 21 00:11:09 crc kubenswrapper[4906]: I0221 00:11:09.412707 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bpdlr" event={"ID":"e2cdeb59-eeb8-4641-921e-05a840fd96fa","Type":"ContainerDied","Data":"44f8ac25733dd0e112cf8018d266ce62b77b4a5ab875a018e9a5ddf2ec81c413"} Feb 21 00:11:10 crc kubenswrapper[4906]: I0221 00:11:10.425908 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gbrvn" event={"ID":"73edbe76-4cd4-45e2-903c-8329ddfa9f58","Type":"ContainerDied","Data":"d2314e5c868365760402b8a8885f9c20460cb0f6de91ff8a93d0e8757733f445"} Feb 21 00:11:10 crc kubenswrapper[4906]: I0221 00:11:10.425948 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2314e5c868365760402b8a8885f9c20460cb0f6de91ff8a93d0e8757733f445" Feb 21 00:11:10 crc kubenswrapper[4906]: I0221 00:11:10.432736 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m8xm9" event={"ID":"c69b5bb2-d2af-4aa6-a35e-9c96b3f78622","Type":"ContainerDied","Data":"7d56d8e1f775ca54b5f7b567c49248398dc62ecfa0afb4e2dfd1c3db6560cecc"} Feb 21 00:11:10 crc kubenswrapper[4906]: I0221 00:11:10.432775 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d56d8e1f775ca54b5f7b567c49248398dc62ecfa0afb4e2dfd1c3db6560cecc" Feb 21 00:11:10 crc kubenswrapper[4906]: I0221 00:11:10.474009 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-m8xm9" Feb 21 00:11:10 crc kubenswrapper[4906]: I0221 00:11:10.479905 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gbrvn" Feb 21 00:11:10 crc kubenswrapper[4906]: I0221 00:11:10.599819 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73edbe76-4cd4-45e2-903c-8329ddfa9f58-utilities\") pod \"73edbe76-4cd4-45e2-903c-8329ddfa9f58\" (UID: \"73edbe76-4cd4-45e2-903c-8329ddfa9f58\") " Feb 21 00:11:10 crc kubenswrapper[4906]: I0221 00:11:10.599987 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wd8vj\" (UniqueName: \"kubernetes.io/projected/73edbe76-4cd4-45e2-903c-8329ddfa9f58-kube-api-access-wd8vj\") pod \"73edbe76-4cd4-45e2-903c-8329ddfa9f58\" (UID: \"73edbe76-4cd4-45e2-903c-8329ddfa9f58\") " Feb 21 00:11:10 crc kubenswrapper[4906]: I0221 00:11:10.600063 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpwgs\" (UniqueName: \"kubernetes.io/projected/c69b5bb2-d2af-4aa6-a35e-9c96b3f78622-kube-api-access-kpwgs\") pod \"c69b5bb2-d2af-4aa6-a35e-9c96b3f78622\" (UID: \"c69b5bb2-d2af-4aa6-a35e-9c96b3f78622\") " Feb 21 00:11:10 crc kubenswrapper[4906]: I0221 00:11:10.600109 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c69b5bb2-d2af-4aa6-a35e-9c96b3f78622-catalog-content\") pod \"c69b5bb2-d2af-4aa6-a35e-9c96b3f78622\" (UID: \"c69b5bb2-d2af-4aa6-a35e-9c96b3f78622\") " Feb 21 00:11:10 crc kubenswrapper[4906]: I0221 00:11:10.600162 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73edbe76-4cd4-45e2-903c-8329ddfa9f58-catalog-content\") pod 
\"73edbe76-4cd4-45e2-903c-8329ddfa9f58\" (UID: \"73edbe76-4cd4-45e2-903c-8329ddfa9f58\") " Feb 21 00:11:10 crc kubenswrapper[4906]: I0221 00:11:10.600255 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c69b5bb2-d2af-4aa6-a35e-9c96b3f78622-utilities\") pod \"c69b5bb2-d2af-4aa6-a35e-9c96b3f78622\" (UID: \"c69b5bb2-d2af-4aa6-a35e-9c96b3f78622\") " Feb 21 00:11:10 crc kubenswrapper[4906]: I0221 00:11:10.600865 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73edbe76-4cd4-45e2-903c-8329ddfa9f58-utilities" (OuterVolumeSpecName: "utilities") pod "73edbe76-4cd4-45e2-903c-8329ddfa9f58" (UID: "73edbe76-4cd4-45e2-903c-8329ddfa9f58"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:11:10 crc kubenswrapper[4906]: I0221 00:11:10.601476 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c69b5bb2-d2af-4aa6-a35e-9c96b3f78622-utilities" (OuterVolumeSpecName: "utilities") pod "c69b5bb2-d2af-4aa6-a35e-9c96b3f78622" (UID: "c69b5bb2-d2af-4aa6-a35e-9c96b3f78622"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:11:10 crc kubenswrapper[4906]: I0221 00:11:10.605103 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73edbe76-4cd4-45e2-903c-8329ddfa9f58-kube-api-access-wd8vj" (OuterVolumeSpecName: "kube-api-access-wd8vj") pod "73edbe76-4cd4-45e2-903c-8329ddfa9f58" (UID: "73edbe76-4cd4-45e2-903c-8329ddfa9f58"). InnerVolumeSpecName "kube-api-access-wd8vj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:11:10 crc kubenswrapper[4906]: I0221 00:11:10.605263 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c69b5bb2-d2af-4aa6-a35e-9c96b3f78622-kube-api-access-kpwgs" (OuterVolumeSpecName: "kube-api-access-kpwgs") pod "c69b5bb2-d2af-4aa6-a35e-9c96b3f78622" (UID: "c69b5bb2-d2af-4aa6-a35e-9c96b3f78622"). InnerVolumeSpecName "kube-api-access-kpwgs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:11:10 crc kubenswrapper[4906]: I0221 00:11:10.660484 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73edbe76-4cd4-45e2-903c-8329ddfa9f58-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "73edbe76-4cd4-45e2-903c-8329ddfa9f58" (UID: "73edbe76-4cd4-45e2-903c-8329ddfa9f58"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:11:10 crc kubenswrapper[4906]: I0221 00:11:10.683205 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c69b5bb2-d2af-4aa6-a35e-9c96b3f78622-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c69b5bb2-d2af-4aa6-a35e-9c96b3f78622" (UID: "c69b5bb2-d2af-4aa6-a35e-9c96b3f78622"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:11:10 crc kubenswrapper[4906]: I0221 00:11:10.701835 4906 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c69b5bb2-d2af-4aa6-a35e-9c96b3f78622-utilities\") on node \"crc\" DevicePath \"\"" Feb 21 00:11:10 crc kubenswrapper[4906]: I0221 00:11:10.701872 4906 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73edbe76-4cd4-45e2-903c-8329ddfa9f58-utilities\") on node \"crc\" DevicePath \"\"" Feb 21 00:11:10 crc kubenswrapper[4906]: I0221 00:11:10.701890 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wd8vj\" (UniqueName: \"kubernetes.io/projected/73edbe76-4cd4-45e2-903c-8329ddfa9f58-kube-api-access-wd8vj\") on node \"crc\" DevicePath \"\"" Feb 21 00:11:10 crc kubenswrapper[4906]: I0221 00:11:10.701905 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpwgs\" (UniqueName: \"kubernetes.io/projected/c69b5bb2-d2af-4aa6-a35e-9c96b3f78622-kube-api-access-kpwgs\") on node \"crc\" DevicePath \"\"" Feb 21 00:11:10 crc kubenswrapper[4906]: I0221 00:11:10.701918 4906 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c69b5bb2-d2af-4aa6-a35e-9c96b3f78622-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 21 00:11:10 crc kubenswrapper[4906]: I0221 00:11:10.701929 4906 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73edbe76-4cd4-45e2-903c-8329ddfa9f58-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 21 00:11:11 crc kubenswrapper[4906]: I0221 00:11:11.438472 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-m8xm9" Feb 21 00:11:11 crc kubenswrapper[4906]: I0221 00:11:11.438747 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5mlvx" event={"ID":"06df1df6-9fbb-40ed-a746-f90f10d5286f","Type":"ContainerStarted","Data":"118efcbdcfc1710531b17df1d6a1ca120bceee20bc949f8d3a5bab010d22e928"} Feb 21 00:11:11 crc kubenswrapper[4906]: I0221 00:11:11.439487 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gbrvn" Feb 21 00:11:11 crc kubenswrapper[4906]: I0221 00:11:11.607422 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m8xm9"] Feb 21 00:11:11 crc kubenswrapper[4906]: I0221 00:11:11.612526 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-m8xm9"] Feb 21 00:11:11 crc kubenswrapper[4906]: I0221 00:11:11.628740 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gbrvn"] Feb 21 00:11:11 crc kubenswrapper[4906]: I0221 00:11:11.632838 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gbrvn"] Feb 21 00:11:12 crc kubenswrapper[4906]: I0221 00:11:12.166596 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bhglw"] Feb 21 00:11:12 crc kubenswrapper[4906]: I0221 00:11:12.167210 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bhglw" podUID="efb5d164-4d17-4181-b6a8-765003ec5f57" containerName="registry-server" containerID="cri-o://0495a42ba8262a96e2320e92b87885f5113e6de9b26dad20f4e8a917fcd21bda" gracePeriod=2 Feb 21 00:11:12 crc kubenswrapper[4906]: I0221 00:11:12.444277 4906 generic.go:334] "Generic (PLEG): container finished" podID="efb5d164-4d17-4181-b6a8-765003ec5f57" 
containerID="0495a42ba8262a96e2320e92b87885f5113e6de9b26dad20f4e8a917fcd21bda" exitCode=0 Feb 21 00:11:12 crc kubenswrapper[4906]: I0221 00:11:12.445077 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bhglw" event={"ID":"efb5d164-4d17-4181-b6a8-765003ec5f57","Type":"ContainerDied","Data":"0495a42ba8262a96e2320e92b87885f5113e6de9b26dad20f4e8a917fcd21bda"} Feb 21 00:11:12 crc kubenswrapper[4906]: I0221 00:11:12.446478 4906 generic.go:334] "Generic (PLEG): container finished" podID="92ef311b-1a77-4d80-be38-637a6aa7de11" containerID="4be5ab5bcbcdf5618c9da199620d9d10bd3bf05a9a9539523ebeedc86a0d5efd" exitCode=0 Feb 21 00:11:12 crc kubenswrapper[4906]: I0221 00:11:12.446613 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gpk7r" event={"ID":"92ef311b-1a77-4d80-be38-637a6aa7de11","Type":"ContainerDied","Data":"4be5ab5bcbcdf5618c9da199620d9d10bd3bf05a9a9539523ebeedc86a0d5efd"} Feb 21 00:11:12 crc kubenswrapper[4906]: I0221 00:11:12.449905 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bpdlr" event={"ID":"e2cdeb59-eeb8-4641-921e-05a840fd96fa","Type":"ContainerStarted","Data":"c9ca58e8f0041907d9d30a883af56b57fe0f3e7a2f42f1534e698c32d0526b13"} Feb 21 00:11:12 crc kubenswrapper[4906]: I0221 00:11:12.453774 4906 generic.go:334] "Generic (PLEG): container finished" podID="06df1df6-9fbb-40ed-a746-f90f10d5286f" containerID="118efcbdcfc1710531b17df1d6a1ca120bceee20bc949f8d3a5bab010d22e928" exitCode=0 Feb 21 00:11:12 crc kubenswrapper[4906]: I0221 00:11:12.453825 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5mlvx" event={"ID":"06df1df6-9fbb-40ed-a746-f90f10d5286f","Type":"ContainerDied","Data":"118efcbdcfc1710531b17df1d6a1ca120bceee20bc949f8d3a5bab010d22e928"} Feb 21 00:11:12 crc kubenswrapper[4906]: I0221 00:11:12.453968 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-5mlvx" event={"ID":"06df1df6-9fbb-40ed-a746-f90f10d5286f","Type":"ContainerStarted","Data":"373b2f2b11d9dd1ff16703493e1fea0ab03355eb78f2b61b6697c4a3bc4eee9e"} Feb 21 00:11:12 crc kubenswrapper[4906]: I0221 00:11:12.455921 4906 generic.go:334] "Generic (PLEG): container finished" podID="0c851d93-c82d-4d73-b456-47c8fa6e3f5d" containerID="55da5e8887635f4de079f3f8ce939427332f162722f9ca10b74dddba93ec093f" exitCode=0 Feb 21 00:11:12 crc kubenswrapper[4906]: I0221 00:11:12.455990 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zf44m" event={"ID":"0c851d93-c82d-4d73-b456-47c8fa6e3f5d","Type":"ContainerDied","Data":"55da5e8887635f4de079f3f8ce939427332f162722f9ca10b74dddba93ec093f"} Feb 21 00:11:12 crc kubenswrapper[4906]: I0221 00:11:12.458492 4906 generic.go:334] "Generic (PLEG): container finished" podID="2ac4beba-19e0-4278-9701-d830f33d6688" containerID="eb693f02ade8c902a0ede7390f4f45471ee1a6fb53c814fcc67288d51b880547" exitCode=0 Feb 21 00:11:12 crc kubenswrapper[4906]: I0221 00:11:12.458527 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ptwns" event={"ID":"2ac4beba-19e0-4278-9701-d830f33d6688","Type":"ContainerDied","Data":"eb693f02ade8c902a0ede7390f4f45471ee1a6fb53c814fcc67288d51b880547"} Feb 21 00:11:12 crc kubenswrapper[4906]: I0221 00:11:12.522536 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5mlvx" podStartSLOduration=1.752927226 podStartE2EDuration="56.522515044s" podCreationTimestamp="2026-02-21 00:10:16 +0000 UTC" firstStartedPulling="2026-02-21 00:10:17.328995795 +0000 UTC m=+152.580583301" lastFinishedPulling="2026-02-21 00:11:12.098583613 +0000 UTC m=+207.350171119" observedRunningTime="2026-02-21 00:11:12.520460516 +0000 UTC m=+207.772048042" watchObservedRunningTime="2026-02-21 00:11:12.522515044 +0000 UTC m=+207.774102570" Feb 21 
00:11:12 crc kubenswrapper[4906]: I0221 00:11:12.541488 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bpdlr" podStartSLOduration=2.027034666 podStartE2EDuration="55.541470126s" podCreationTimestamp="2026-02-21 00:10:17 +0000 UTC" firstStartedPulling="2026-02-21 00:10:18.350293107 +0000 UTC m=+153.601880613" lastFinishedPulling="2026-02-21 00:11:11.864728567 +0000 UTC m=+207.116316073" observedRunningTime="2026-02-21 00:11:12.538051589 +0000 UTC m=+207.789639105" watchObservedRunningTime="2026-02-21 00:11:12.541470126 +0000 UTC m=+207.793057632" Feb 21 00:11:13 crc kubenswrapper[4906]: I0221 00:11:13.123981 4906 patch_prober.go:28] interesting pod/machine-config-daemon-b9qdv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 00:11:13 crc kubenswrapper[4906]: I0221 00:11:13.124039 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 00:11:13 crc kubenswrapper[4906]: I0221 00:11:13.124091 4906 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" Feb 21 00:11:13 crc kubenswrapper[4906]: I0221 00:11:13.125020 4906 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c753f098aae83a1b91b668b00166c9de9e5fc03f7a39708263241e934d83fb81"} pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" containerMessage="Container machine-config-daemon failed liveness probe, will be 
restarted" Feb 21 00:11:13 crc kubenswrapper[4906]: I0221 00:11:13.125431 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3" containerName="machine-config-daemon" containerID="cri-o://c753f098aae83a1b91b668b00166c9de9e5fc03f7a39708263241e934d83fb81" gracePeriod=600 Feb 21 00:11:13 crc kubenswrapper[4906]: I0221 00:11:13.168137 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bhglw" Feb 21 00:11:13 crc kubenswrapper[4906]: I0221 00:11:13.248378 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efb5d164-4d17-4181-b6a8-765003ec5f57-utilities\") pod \"efb5d164-4d17-4181-b6a8-765003ec5f57\" (UID: \"efb5d164-4d17-4181-b6a8-765003ec5f57\") " Feb 21 00:11:13 crc kubenswrapper[4906]: I0221 00:11:13.248437 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6254f\" (UniqueName: \"kubernetes.io/projected/efb5d164-4d17-4181-b6a8-765003ec5f57-kube-api-access-6254f\") pod \"efb5d164-4d17-4181-b6a8-765003ec5f57\" (UID: \"efb5d164-4d17-4181-b6a8-765003ec5f57\") " Feb 21 00:11:13 crc kubenswrapper[4906]: I0221 00:11:13.248471 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efb5d164-4d17-4181-b6a8-765003ec5f57-catalog-content\") pod \"efb5d164-4d17-4181-b6a8-765003ec5f57\" (UID: \"efb5d164-4d17-4181-b6a8-765003ec5f57\") " Feb 21 00:11:13 crc kubenswrapper[4906]: I0221 00:11:13.250440 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efb5d164-4d17-4181-b6a8-765003ec5f57-utilities" (OuterVolumeSpecName: "utilities") pod "efb5d164-4d17-4181-b6a8-765003ec5f57" (UID: 
"efb5d164-4d17-4181-b6a8-765003ec5f57"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:11:13 crc kubenswrapper[4906]: I0221 00:11:13.255006 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efb5d164-4d17-4181-b6a8-765003ec5f57-kube-api-access-6254f" (OuterVolumeSpecName: "kube-api-access-6254f") pod "efb5d164-4d17-4181-b6a8-765003ec5f57" (UID: "efb5d164-4d17-4181-b6a8-765003ec5f57"). InnerVolumeSpecName "kube-api-access-6254f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:11:13 crc kubenswrapper[4906]: I0221 00:11:13.349759 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6254f\" (UniqueName: \"kubernetes.io/projected/efb5d164-4d17-4181-b6a8-765003ec5f57-kube-api-access-6254f\") on node \"crc\" DevicePath \"\"" Feb 21 00:11:13 crc kubenswrapper[4906]: I0221 00:11:13.349794 4906 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efb5d164-4d17-4181-b6a8-765003ec5f57-utilities\") on node \"crc\" DevicePath \"\"" Feb 21 00:11:13 crc kubenswrapper[4906]: I0221 00:11:13.395825 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efb5d164-4d17-4181-b6a8-765003ec5f57-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "efb5d164-4d17-4181-b6a8-765003ec5f57" (UID: "efb5d164-4d17-4181-b6a8-765003ec5f57"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:11:13 crc kubenswrapper[4906]: I0221 00:11:13.450842 4906 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efb5d164-4d17-4181-b6a8-765003ec5f57-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 21 00:11:13 crc kubenswrapper[4906]: I0221 00:11:13.466180 4906 generic.go:334] "Generic (PLEG): container finished" podID="17518505-fa81-4399-b6cd-5527dae35ef3" containerID="c753f098aae83a1b91b668b00166c9de9e5fc03f7a39708263241e934d83fb81" exitCode=0 Feb 21 00:11:13 crc kubenswrapper[4906]: I0221 00:11:13.466273 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" event={"ID":"17518505-fa81-4399-b6cd-5527dae35ef3","Type":"ContainerDied","Data":"c753f098aae83a1b91b668b00166c9de9e5fc03f7a39708263241e934d83fb81"} Feb 21 00:11:13 crc kubenswrapper[4906]: I0221 00:11:13.468718 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ptwns" event={"ID":"2ac4beba-19e0-4278-9701-d830f33d6688","Type":"ContainerStarted","Data":"541b36279f5c49d0d7dddf3cfa15d35e92e940ae29bf529ca338250fa6074da8"} Feb 21 00:11:13 crc kubenswrapper[4906]: I0221 00:11:13.471333 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bhglw" event={"ID":"efb5d164-4d17-4181-b6a8-765003ec5f57","Type":"ContainerDied","Data":"b6568df4504fb57b7a444febe063e64a4430e26eb074ebdb6efdae3a6a0a6e1a"} Feb 21 00:11:13 crc kubenswrapper[4906]: I0221 00:11:13.471378 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bhglw" Feb 21 00:11:13 crc kubenswrapper[4906]: I0221 00:11:13.471393 4906 scope.go:117] "RemoveContainer" containerID="0495a42ba8262a96e2320e92b87885f5113e6de9b26dad20f4e8a917fcd21bda" Feb 21 00:11:13 crc kubenswrapper[4906]: I0221 00:11:13.507189 4906 scope.go:117] "RemoveContainer" containerID="f70d91513363dab29d9aea95f466f5931a7fe5505457e3db3fc03a02cc3137ab" Feb 21 00:11:13 crc kubenswrapper[4906]: I0221 00:11:13.513270 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ptwns" podStartSLOduration=2.728065173 podStartE2EDuration="1m0.513249481s" podCreationTimestamp="2026-02-21 00:10:13 +0000 UTC" firstStartedPulling="2026-02-21 00:10:15.290163982 +0000 UTC m=+150.541751488" lastFinishedPulling="2026-02-21 00:11:13.0753483 +0000 UTC m=+208.326935796" observedRunningTime="2026-02-21 00:11:13.504894192 +0000 UTC m=+208.756481698" watchObservedRunningTime="2026-02-21 00:11:13.513249481 +0000 UTC m=+208.764836987" Feb 21 00:11:13 crc kubenswrapper[4906]: I0221 00:11:13.527116 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73edbe76-4cd4-45e2-903c-8329ddfa9f58" path="/var/lib/kubelet/pods/73edbe76-4cd4-45e2-903c-8329ddfa9f58/volumes" Feb 21 00:11:13 crc kubenswrapper[4906]: I0221 00:11:13.527757 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c69b5bb2-d2af-4aa6-a35e-9c96b3f78622" path="/var/lib/kubelet/pods/c69b5bb2-d2af-4aa6-a35e-9c96b3f78622/volumes" Feb 21 00:11:13 crc kubenswrapper[4906]: I0221 00:11:13.534772 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bhglw"] Feb 21 00:11:13 crc kubenswrapper[4906]: I0221 00:11:13.541187 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bhglw"] Feb 21 00:11:13 crc kubenswrapper[4906]: I0221 00:11:13.551076 4906 scope.go:117] 
"RemoveContainer" containerID="8d9c2e8c54ffd7a6eb9a3995875d8522ba83ecd0a6f6c6ac9f2bf30eb6206ae0" Feb 21 00:11:14 crc kubenswrapper[4906]: I0221 00:11:14.250792 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ptwns" Feb 21 00:11:14 crc kubenswrapper[4906]: I0221 00:11:14.251156 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ptwns" Feb 21 00:11:14 crc kubenswrapper[4906]: I0221 00:11:14.478852 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zf44m" event={"ID":"0c851d93-c82d-4d73-b456-47c8fa6e3f5d","Type":"ContainerStarted","Data":"dd5d961bb852f6b1e00335661d1386af3f7e62cfb894648e87ac4bd918ba2c47"} Feb 21 00:11:14 crc kubenswrapper[4906]: I0221 00:11:14.483217 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" event={"ID":"17518505-fa81-4399-b6cd-5527dae35ef3","Type":"ContainerStarted","Data":"202df608ff6999e26cdc9a4d938cc29189ef1bc902ce31f6c1f085f6055345ad"} Feb 21 00:11:14 crc kubenswrapper[4906]: I0221 00:11:14.484896 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gpk7r" event={"ID":"92ef311b-1a77-4d80-be38-637a6aa7de11","Type":"ContainerStarted","Data":"1c6e99ab906f09dbb4e94d02e19e75ed740f650bb0efd4097ee7eced30605998"} Feb 21 00:11:14 crc kubenswrapper[4906]: I0221 00:11:14.504287 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zf44m" podStartSLOduration=2.568933735 podStartE2EDuration="1m0.504266967s" podCreationTimestamp="2026-02-21 00:10:14 +0000 UTC" firstStartedPulling="2026-02-21 00:10:15.299729675 +0000 UTC m=+150.551317181" lastFinishedPulling="2026-02-21 00:11:13.235062907 +0000 UTC m=+208.486650413" observedRunningTime="2026-02-21 00:11:14.500360845 +0000 UTC 
m=+209.751948351" watchObservedRunningTime="2026-02-21 00:11:14.504266967 +0000 UTC m=+209.755854473" Feb 21 00:11:14 crc kubenswrapper[4906]: I0221 00:11:14.536302 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gpk7r" podStartSLOduration=3.3909777119999998 podStartE2EDuration="58.536284292s" podCreationTimestamp="2026-02-21 00:10:16 +0000 UTC" firstStartedPulling="2026-02-21 00:10:18.352391526 +0000 UTC m=+153.603979022" lastFinishedPulling="2026-02-21 00:11:13.497698096 +0000 UTC m=+208.749285602" observedRunningTime="2026-02-21 00:11:14.532844284 +0000 UTC m=+209.784431790" watchObservedRunningTime="2026-02-21 00:11:14.536284292 +0000 UTC m=+209.787871798" Feb 21 00:11:15 crc kubenswrapper[4906]: I0221 00:11:15.287429 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-ptwns" podUID="2ac4beba-19e0-4278-9701-d830f33d6688" containerName="registry-server" probeResult="failure" output=< Feb 21 00:11:15 crc kubenswrapper[4906]: timeout: failed to connect service ":50051" within 1s Feb 21 00:11:15 crc kubenswrapper[4906]: > Feb 21 00:11:15 crc kubenswrapper[4906]: I0221 00:11:15.524018 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efb5d164-4d17-4181-b6a8-765003ec5f57" path="/var/lib/kubelet/pods/efb5d164-4d17-4181-b6a8-765003ec5f57/volumes" Feb 21 00:11:16 crc kubenswrapper[4906]: I0221 00:11:16.449348 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5mlvx" Feb 21 00:11:16 crc kubenswrapper[4906]: I0221 00:11:16.449466 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5mlvx" Feb 21 00:11:16 crc kubenswrapper[4906]: I0221 00:11:16.494949 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5mlvx" Feb 21 00:11:16 crc 
kubenswrapper[4906]: I0221 00:11:16.954974 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gpk7r"
Feb 21 00:11:16 crc kubenswrapper[4906]: I0221 00:11:16.955035 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gpk7r"
Feb 21 00:11:16 crc kubenswrapper[4906]: I0221 00:11:16.999097 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gpk7r"
Feb 21 00:11:17 crc kubenswrapper[4906]: I0221 00:11:17.485584 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bpdlr"
Feb 21 00:11:17 crc kubenswrapper[4906]: I0221 00:11:17.485647 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bpdlr"
Feb 21 00:11:17 crc kubenswrapper[4906]: I0221 00:11:17.541586 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5mlvx"
Feb 21 00:11:18 crc kubenswrapper[4906]: I0221 00:11:18.532826 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bpdlr" podUID="e2cdeb59-eeb8-4641-921e-05a840fd96fa" containerName="registry-server" probeResult="failure" output=<
Feb 21 00:11:18 crc kubenswrapper[4906]: timeout: failed to connect service ":50051" within 1s
Feb 21 00:11:18 crc kubenswrapper[4906]: >
Feb 21 00:11:19 crc kubenswrapper[4906]: I0221 00:11:19.895716 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j" podUID="e0f1e160-28c6-4ff5-8b24-8c962f120747" containerName="oauth-openshift" containerID="cri-o://20e4fbf06d46e679cf87d69a35a81277af7a8c3e978e9ad9b1a12951f78dade2" gracePeriod=15
Feb 21 00:11:20 crc kubenswrapper[4906]: I0221 00:11:20.526360 4906 generic.go:334] "Generic (PLEG): container finished" podID="e0f1e160-28c6-4ff5-8b24-8c962f120747" containerID="20e4fbf06d46e679cf87d69a35a81277af7a8c3e978e9ad9b1a12951f78dade2" exitCode=0
Feb 21 00:11:20 crc kubenswrapper[4906]: I0221 00:11:20.526405 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j" event={"ID":"e0f1e160-28c6-4ff5-8b24-8c962f120747","Type":"ContainerDied","Data":"20e4fbf06d46e679cf87d69a35a81277af7a8c3e978e9ad9b1a12951f78dade2"}
Feb 21 00:11:21 crc kubenswrapper[4906]: I0221 00:11:21.233702 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j"
Feb 21 00:11:21 crc kubenswrapper[4906]: I0221 00:11:21.356841 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-trusted-ca-bundle\") pod \"e0f1e160-28c6-4ff5-8b24-8c962f120747\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") "
Feb 21 00:11:21 crc kubenswrapper[4906]: I0221 00:11:21.357229 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-session\") pod \"e0f1e160-28c6-4ff5-8b24-8c962f120747\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") "
Feb 21 00:11:21 crc kubenswrapper[4906]: I0221 00:11:21.357270 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-service-ca\") pod \"e0f1e160-28c6-4ff5-8b24-8c962f120747\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") "
Feb 21 00:11:21 crc kubenswrapper[4906]: I0221 00:11:21.357303 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-user-template-provider-selection\") pod \"e0f1e160-28c6-4ff5-8b24-8c962f120747\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") "
Feb 21 00:11:21 crc kubenswrapper[4906]: I0221 00:11:21.357361 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-router-certs\") pod \"e0f1e160-28c6-4ff5-8b24-8c962f120747\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") "
Feb 21 00:11:21 crc kubenswrapper[4906]: I0221 00:11:21.357399 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfkqm\" (UniqueName: \"kubernetes.io/projected/e0f1e160-28c6-4ff5-8b24-8c962f120747-kube-api-access-dfkqm\") pod \"e0f1e160-28c6-4ff5-8b24-8c962f120747\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") "
Feb 21 00:11:21 crc kubenswrapper[4906]: I0221 00:11:21.357448 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-ocp-branding-template\") pod \"e0f1e160-28c6-4ff5-8b24-8c962f120747\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") "
Feb 21 00:11:21 crc kubenswrapper[4906]: I0221 00:11:21.357484 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-serving-cert\") pod \"e0f1e160-28c6-4ff5-8b24-8c962f120747\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") "
Feb 21 00:11:21 crc kubenswrapper[4906]: I0221 00:11:21.357508 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-user-template-error\") pod \"e0f1e160-28c6-4ff5-8b24-8c962f120747\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") "
Feb 21 00:11:21 crc kubenswrapper[4906]: I0221 00:11:21.357549 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-user-template-login\") pod \"e0f1e160-28c6-4ff5-8b24-8c962f120747\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") "
Feb 21 00:11:21 crc kubenswrapper[4906]: I0221 00:11:21.357576 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-user-idp-0-file-data\") pod \"e0f1e160-28c6-4ff5-8b24-8c962f120747\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") "
Feb 21 00:11:21 crc kubenswrapper[4906]: I0221 00:11:21.357605 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e0f1e160-28c6-4ff5-8b24-8c962f120747-audit-dir\") pod \"e0f1e160-28c6-4ff5-8b24-8c962f120747\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") "
Feb 21 00:11:21 crc kubenswrapper[4906]: I0221 00:11:21.357635 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e0f1e160-28c6-4ff5-8b24-8c962f120747-audit-policies\") pod \"e0f1e160-28c6-4ff5-8b24-8c962f120747\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") "
Feb 21 00:11:21 crc kubenswrapper[4906]: I0221 00:11:21.357658 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-cliconfig\") pod \"e0f1e160-28c6-4ff5-8b24-8c962f120747\" (UID: \"e0f1e160-28c6-4ff5-8b24-8c962f120747\") "
Feb 21 00:11:21 crc kubenswrapper[4906]: I0221 00:11:21.357863 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "e0f1e160-28c6-4ff5-8b24-8c962f120747" (UID: "e0f1e160-28c6-4ff5-8b24-8c962f120747"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 00:11:21 crc kubenswrapper[4906]: I0221 00:11:21.357886 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "e0f1e160-28c6-4ff5-8b24-8c962f120747" (UID: "e0f1e160-28c6-4ff5-8b24-8c962f120747"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 00:11:21 crc kubenswrapper[4906]: I0221 00:11:21.358076 4906 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Feb 21 00:11:21 crc kubenswrapper[4906]: I0221 00:11:21.358091 4906 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 21 00:11:21 crc kubenswrapper[4906]: I0221 00:11:21.358386 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0f1e160-28c6-4ff5-8b24-8c962f120747-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "e0f1e160-28c6-4ff5-8b24-8c962f120747" (UID: "e0f1e160-28c6-4ff5-8b24-8c962f120747"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 21 00:11:21 crc kubenswrapper[4906]: I0221 00:11:21.358831 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0f1e160-28c6-4ff5-8b24-8c962f120747-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "e0f1e160-28c6-4ff5-8b24-8c962f120747" (UID: "e0f1e160-28c6-4ff5-8b24-8c962f120747"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 00:11:21 crc kubenswrapper[4906]: I0221 00:11:21.359018 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "e0f1e160-28c6-4ff5-8b24-8c962f120747" (UID: "e0f1e160-28c6-4ff5-8b24-8c962f120747"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 00:11:21 crc kubenswrapper[4906]: I0221 00:11:21.365002 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0f1e160-28c6-4ff5-8b24-8c962f120747-kube-api-access-dfkqm" (OuterVolumeSpecName: "kube-api-access-dfkqm") pod "e0f1e160-28c6-4ff5-8b24-8c962f120747" (UID: "e0f1e160-28c6-4ff5-8b24-8c962f120747"). InnerVolumeSpecName "kube-api-access-dfkqm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 00:11:21 crc kubenswrapper[4906]: I0221 00:11:21.365065 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "e0f1e160-28c6-4ff5-8b24-8c962f120747" (UID: "e0f1e160-28c6-4ff5-8b24-8c962f120747"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 00:11:21 crc kubenswrapper[4906]: I0221 00:11:21.365308 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "e0f1e160-28c6-4ff5-8b24-8c962f120747" (UID: "e0f1e160-28c6-4ff5-8b24-8c962f120747"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 00:11:21 crc kubenswrapper[4906]: I0221 00:11:21.366004 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "e0f1e160-28c6-4ff5-8b24-8c962f120747" (UID: "e0f1e160-28c6-4ff5-8b24-8c962f120747"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 00:11:21 crc kubenswrapper[4906]: I0221 00:11:21.374564 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "e0f1e160-28c6-4ff5-8b24-8c962f120747" (UID: "e0f1e160-28c6-4ff5-8b24-8c962f120747"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 00:11:21 crc kubenswrapper[4906]: I0221 00:11:21.375006 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "e0f1e160-28c6-4ff5-8b24-8c962f120747" (UID: "e0f1e160-28c6-4ff5-8b24-8c962f120747"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 00:11:21 crc kubenswrapper[4906]: I0221 00:11:21.375169 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "e0f1e160-28c6-4ff5-8b24-8c962f120747" (UID: "e0f1e160-28c6-4ff5-8b24-8c962f120747"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 00:11:21 crc kubenswrapper[4906]: I0221 00:11:21.375374 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "e0f1e160-28c6-4ff5-8b24-8c962f120747" (UID: "e0f1e160-28c6-4ff5-8b24-8c962f120747"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 00:11:21 crc kubenswrapper[4906]: I0221 00:11:21.375487 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "e0f1e160-28c6-4ff5-8b24-8c962f120747" (UID: "e0f1e160-28c6-4ff5-8b24-8c962f120747"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 00:11:21 crc kubenswrapper[4906]: I0221 00:11:21.458998 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfkqm\" (UniqueName: \"kubernetes.io/projected/e0f1e160-28c6-4ff5-8b24-8c962f120747-kube-api-access-dfkqm\") on node \"crc\" DevicePath \"\""
Feb 21 00:11:21 crc kubenswrapper[4906]: I0221 00:11:21.459280 4906 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Feb 21 00:11:21 crc kubenswrapper[4906]: I0221 00:11:21.459416 4906 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 21 00:11:21 crc kubenswrapper[4906]: I0221 00:11:21.459521 4906 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Feb 21 00:11:21 crc kubenswrapper[4906]: I0221 00:11:21.459611 4906 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Feb 21 00:11:21 crc kubenswrapper[4906]: I0221 00:11:21.459727 4906 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Feb 21 00:11:21 crc kubenswrapper[4906]: I0221 00:11:21.459815 4906 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e0f1e160-28c6-4ff5-8b24-8c962f120747-audit-dir\") on node \"crc\" DevicePath \"\""
Feb 21 00:11:21 crc kubenswrapper[4906]: I0221 00:11:21.459894 4906 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e0f1e160-28c6-4ff5-8b24-8c962f120747-audit-policies\") on node \"crc\" DevicePath \"\""
Feb 21 00:11:21 crc kubenswrapper[4906]: I0221 00:11:21.460034 4906 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Feb 21 00:11:21 crc kubenswrapper[4906]: I0221 00:11:21.460135 4906 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Feb 21 00:11:21 crc kubenswrapper[4906]: I0221 00:11:21.460252 4906 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Feb 21 00:11:21 crc kubenswrapper[4906]: I0221 00:11:21.460350 4906 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e0f1e160-28c6-4ff5-8b24-8c962f120747-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Feb 21 00:11:21 crc kubenswrapper[4906]: I0221 00:11:21.533222 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j" event={"ID":"e0f1e160-28c6-4ff5-8b24-8c962f120747","Type":"ContainerDied","Data":"10d2d9960efef7d1474dce7618c81a89160135371788af09ee874e16d051f65e"}
Feb 21 00:11:21 crc kubenswrapper[4906]: I0221 00:11:21.533275 4906 scope.go:117] "RemoveContainer" containerID="20e4fbf06d46e679cf87d69a35a81277af7a8c3e978e9ad9b1a12951f78dade2"
Feb 21 00:11:21 crc kubenswrapper[4906]: I0221 00:11:21.533884 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-9kg2j"
Feb 21 00:11:21 crc kubenswrapper[4906]: I0221 00:11:21.581259 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9kg2j"]
Feb 21 00:11:21 crc kubenswrapper[4906]: I0221 00:11:21.584176 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9kg2j"]
Feb 21 00:11:22 crc kubenswrapper[4906]: I0221 00:11:22.858378 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5678f9c799-jm87l"]
Feb 21 00:11:22 crc kubenswrapper[4906]: E0221 00:11:22.858617 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c69b5bb2-d2af-4aa6-a35e-9c96b3f78622" containerName="registry-server"
Feb 21 00:11:22 crc kubenswrapper[4906]: I0221 00:11:22.858635 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="c69b5bb2-d2af-4aa6-a35e-9c96b3f78622" containerName="registry-server"
Feb 21 00:11:22 crc kubenswrapper[4906]: E0221 00:11:22.858656 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73edbe76-4cd4-45e2-903c-8329ddfa9f58" containerName="registry-server"
Feb 21 00:11:22 crc kubenswrapper[4906]: I0221 00:11:22.858664 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="73edbe76-4cd4-45e2-903c-8329ddfa9f58" containerName="registry-server"
Feb 21 00:11:22 crc kubenswrapper[4906]: E0221 00:11:22.858675 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb5d164-4d17-4181-b6a8-765003ec5f57" containerName="extract-utilities"
Feb 21 00:11:22 crc kubenswrapper[4906]: I0221 00:11:22.858703 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb5d164-4d17-4181-b6a8-765003ec5f57" containerName="extract-utilities"
Feb 21 00:11:22 crc kubenswrapper[4906]: E0221 00:11:22.858714 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c69b5bb2-d2af-4aa6-a35e-9c96b3f78622" containerName="extract-content"
Feb 21 00:11:22 crc kubenswrapper[4906]: I0221 00:11:22.858723 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="c69b5bb2-d2af-4aa6-a35e-9c96b3f78622" containerName="extract-content"
Feb 21 00:11:22 crc kubenswrapper[4906]: E0221 00:11:22.858733 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb5d164-4d17-4181-b6a8-765003ec5f57" containerName="extract-content"
Feb 21 00:11:22 crc kubenswrapper[4906]: I0221 00:11:22.858742 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb5d164-4d17-4181-b6a8-765003ec5f57" containerName="extract-content"
Feb 21 00:11:22 crc kubenswrapper[4906]: E0221 00:11:22.858754 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb5d164-4d17-4181-b6a8-765003ec5f57" containerName="registry-server"
Feb 21 00:11:22 crc kubenswrapper[4906]: I0221 00:11:22.858761 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb5d164-4d17-4181-b6a8-765003ec5f57" containerName="registry-server"
Feb 21 00:11:22 crc kubenswrapper[4906]: E0221 00:11:22.858769 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c69b5bb2-d2af-4aa6-a35e-9c96b3f78622" containerName="extract-utilities"
Feb 21 00:11:22 crc kubenswrapper[4906]: I0221 00:11:22.858777 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="c69b5bb2-d2af-4aa6-a35e-9c96b3f78622" containerName="extract-utilities"
Feb 21 00:11:22 crc kubenswrapper[4906]: E0221 00:11:22.858786 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73edbe76-4cd4-45e2-903c-8329ddfa9f58" containerName="extract-utilities"
Feb 21 00:11:22 crc kubenswrapper[4906]: I0221 00:11:22.858794 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="73edbe76-4cd4-45e2-903c-8329ddfa9f58" containerName="extract-utilities"
Feb 21 00:11:22 crc kubenswrapper[4906]: E0221 00:11:22.858806 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73edbe76-4cd4-45e2-903c-8329ddfa9f58" containerName="extract-content"
Feb 21 00:11:22 crc kubenswrapper[4906]: I0221 00:11:22.858814 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="73edbe76-4cd4-45e2-903c-8329ddfa9f58" containerName="extract-content"
Feb 21 00:11:22 crc kubenswrapper[4906]: E0221 00:11:22.858842 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0f1e160-28c6-4ff5-8b24-8c962f120747" containerName="oauth-openshift"
Feb 21 00:11:22 crc kubenswrapper[4906]: I0221 00:11:22.858850 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0f1e160-28c6-4ff5-8b24-8c962f120747" containerName="oauth-openshift"
Feb 21 00:11:22 crc kubenswrapper[4906]: I0221 00:11:22.858970 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="efb5d164-4d17-4181-b6a8-765003ec5f57" containerName="registry-server"
Feb 21 00:11:22 crc kubenswrapper[4906]: I0221 00:11:22.858986 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="73edbe76-4cd4-45e2-903c-8329ddfa9f58" containerName="registry-server"
Feb 21 00:11:22 crc kubenswrapper[4906]: I0221 00:11:22.859001 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0f1e160-28c6-4ff5-8b24-8c962f120747" containerName="oauth-openshift"
Feb 21 00:11:22 crc kubenswrapper[4906]: I0221 00:11:22.859011 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="c69b5bb2-d2af-4aa6-a35e-9c96b3f78622" containerName="registry-server"
Feb 21 00:11:22 crc kubenswrapper[4906]: I0221 00:11:22.859420 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5678f9c799-jm87l"
Feb 21 00:11:22 crc kubenswrapper[4906]: I0221 00:11:22.862955 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Feb 21 00:11:22 crc kubenswrapper[4906]: I0221 00:11:22.863030 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Feb 21 00:11:22 crc kubenswrapper[4906]: I0221 00:11:22.863231 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Feb 21 00:11:22 crc kubenswrapper[4906]: I0221 00:11:22.863493 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Feb 21 00:11:22 crc kubenswrapper[4906]: I0221 00:11:22.863625 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Feb 21 00:11:22 crc kubenswrapper[4906]: I0221 00:11:22.864059 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Feb 21 00:11:22 crc kubenswrapper[4906]: I0221 00:11:22.865403 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Feb 21 00:11:22 crc kubenswrapper[4906]: I0221 00:11:22.865505 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Feb 21 00:11:22 crc kubenswrapper[4906]: I0221 00:11:22.865553 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Feb 21 00:11:22 crc kubenswrapper[4906]: I0221 00:11:22.865576 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Feb 21 00:11:22 crc kubenswrapper[4906]: I0221 00:11:22.867768 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Feb 21 00:11:22 crc kubenswrapper[4906]: I0221 00:11:22.868336 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 21 00:11:22 crc kubenswrapper[4906]: I0221 00:11:22.880243 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Feb 21 00:11:22 crc kubenswrapper[4906]: I0221 00:11:22.881422 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5678f9c799-jm87l"]
Feb 21 00:11:22 crc kubenswrapper[4906]: I0221 00:11:22.883199 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Feb 21 00:11:22 crc kubenswrapper[4906]: I0221 00:11:22.890630 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Feb 21 00:11:22 crc kubenswrapper[4906]: I0221 00:11:22.981081 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c4379f6-d7b1-4ff2-b5c8-804656b85e0b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5678f9c799-jm87l\" (UID: \"5c4379f6-d7b1-4ff2-b5c8-804656b85e0b\") " pod="openshift-authentication/oauth-openshift-5678f9c799-jm87l"
Feb 21 00:11:22 crc kubenswrapper[4906]: I0221 00:11:22.981234 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5c4379f6-d7b1-4ff2-b5c8-804656b85e0b-v4-0-config-system-router-certs\") pod \"oauth-openshift-5678f9c799-jm87l\" (UID: \"5c4379f6-d7b1-4ff2-b5c8-804656b85e0b\") " pod="openshift-authentication/oauth-openshift-5678f9c799-jm87l"
Feb 21 00:11:22 crc kubenswrapper[4906]: I0221 00:11:22.981350 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5c4379f6-d7b1-4ff2-b5c8-804656b85e0b-v4-0-config-system-session\") pod \"oauth-openshift-5678f9c799-jm87l\" (UID: \"5c4379f6-d7b1-4ff2-b5c8-804656b85e0b\") " pod="openshift-authentication/oauth-openshift-5678f9c799-jm87l"
Feb 21 00:11:22 crc kubenswrapper[4906]: I0221 00:11:22.981519 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5c4379f6-d7b1-4ff2-b5c8-804656b85e0b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5678f9c799-jm87l\" (UID: \"5c4379f6-d7b1-4ff2-b5c8-804656b85e0b\") " pod="openshift-authentication/oauth-openshift-5678f9c799-jm87l"
Feb 21 00:11:22 crc kubenswrapper[4906]: I0221 00:11:22.981625 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5c4379f6-d7b1-4ff2-b5c8-804656b85e0b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5678f9c799-jm87l\" (UID: \"5c4379f6-d7b1-4ff2-b5c8-804656b85e0b\") " pod="openshift-authentication/oauth-openshift-5678f9c799-jm87l"
Feb 21 00:11:22 crc kubenswrapper[4906]: I0221 00:11:22.981709 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5c4379f6-d7b1-4ff2-b5c8-804656b85e0b-v4-0-config-user-template-error\") pod \"oauth-openshift-5678f9c799-jm87l\" (UID: \"5c4379f6-d7b1-4ff2-b5c8-804656b85e0b\") " pod="openshift-authentication/oauth-openshift-5678f9c799-jm87l"
Feb 21 00:11:22 crc kubenswrapper[4906]: I0221 00:11:22.981772 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9qfl\" (UniqueName: \"kubernetes.io/projected/5c4379f6-d7b1-4ff2-b5c8-804656b85e0b-kube-api-access-h9qfl\") pod \"oauth-openshift-5678f9c799-jm87l\" (UID: \"5c4379f6-d7b1-4ff2-b5c8-804656b85e0b\") " pod="openshift-authentication/oauth-openshift-5678f9c799-jm87l"
Feb 21 00:11:22 crc kubenswrapper[4906]: I0221 00:11:22.981808 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5c4379f6-d7b1-4ff2-b5c8-804656b85e0b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5678f9c799-jm87l\" (UID: \"5c4379f6-d7b1-4ff2-b5c8-804656b85e0b\") " pod="openshift-authentication/oauth-openshift-5678f9c799-jm87l"
Feb 21 00:11:22 crc kubenswrapper[4906]: I0221 00:11:22.981846 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5c4379f6-d7b1-4ff2-b5c8-804656b85e0b-audit-policies\") pod \"oauth-openshift-5678f9c799-jm87l\" (UID: \"5c4379f6-d7b1-4ff2-b5c8-804656b85e0b\") " pod="openshift-authentication/oauth-openshift-5678f9c799-jm87l"
Feb 21 00:11:22 crc kubenswrapper[4906]: I0221 00:11:22.981897 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5c4379f6-d7b1-4ff2-b5c8-804656b85e0b-audit-dir\") pod \"oauth-openshift-5678f9c799-jm87l\" (UID: \"5c4379f6-d7b1-4ff2-b5c8-804656b85e0b\") " pod="openshift-authentication/oauth-openshift-5678f9c799-jm87l"
Feb 21 00:11:22 crc kubenswrapper[4906]: I0221 00:11:22.981931 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5c4379f6-d7b1-4ff2-b5c8-804656b85e0b-v4-0-config-user-template-login\") pod \"oauth-openshift-5678f9c799-jm87l\" (UID: \"5c4379f6-d7b1-4ff2-b5c8-804656b85e0b\") " pod="openshift-authentication/oauth-openshift-5678f9c799-jm87l"
Feb 21 00:11:22 crc kubenswrapper[4906]: I0221 00:11:22.982075 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5c4379f6-d7b1-4ff2-b5c8-804656b85e0b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5678f9c799-jm87l\" (UID: \"5c4379f6-d7b1-4ff2-b5c8-804656b85e0b\") " pod="openshift-authentication/oauth-openshift-5678f9c799-jm87l"
Feb 21 00:11:22 crc kubenswrapper[4906]: I0221 00:11:22.982119 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5c4379f6-d7b1-4ff2-b5c8-804656b85e0b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5678f9c799-jm87l\" (UID: \"5c4379f6-d7b1-4ff2-b5c8-804656b85e0b\") " pod="openshift-authentication/oauth-openshift-5678f9c799-jm87l"
Feb 21 00:11:22 crc kubenswrapper[4906]: I0221 00:11:22.982159 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5c4379f6-d7b1-4ff2-b5c8-804656b85e0b-v4-0-config-system-service-ca\") pod \"oauth-openshift-5678f9c799-jm87l\" (UID: \"5c4379f6-d7b1-4ff2-b5c8-804656b85e0b\") " pod="openshift-authentication/oauth-openshift-5678f9c799-jm87l"
Feb 21 00:11:23 crc kubenswrapper[4906]: I0221 00:11:23.084008 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c4379f6-d7b1-4ff2-b5c8-804656b85e0b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5678f9c799-jm87l\" (UID: \"5c4379f6-d7b1-4ff2-b5c8-804656b85e0b\") " pod="openshift-authentication/oauth-openshift-5678f9c799-jm87l"
Feb 21 00:11:23 crc kubenswrapper[4906]: I0221 00:11:23.084086 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5c4379f6-d7b1-4ff2-b5c8-804656b85e0b-v4-0-config-system-router-certs\") pod \"oauth-openshift-5678f9c799-jm87l\" (UID: \"5c4379f6-d7b1-4ff2-b5c8-804656b85e0b\") " pod="openshift-authentication/oauth-openshift-5678f9c799-jm87l"
Feb 21 00:11:23 crc kubenswrapper[4906]: I0221 00:11:23.084116 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5c4379f6-d7b1-4ff2-b5c8-804656b85e0b-v4-0-config-system-session\") pod \"oauth-openshift-5678f9c799-jm87l\" (UID: \"5c4379f6-d7b1-4ff2-b5c8-804656b85e0b\") " pod="openshift-authentication/oauth-openshift-5678f9c799-jm87l"
Feb 21 00:11:23 crc kubenswrapper[4906]: I0221 00:11:23.084155 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5c4379f6-d7b1-4ff2-b5c8-804656b85e0b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5678f9c799-jm87l\" (UID: \"5c4379f6-d7b1-4ff2-b5c8-804656b85e0b\") " pod="openshift-authentication/oauth-openshift-5678f9c799-jm87l"
Feb 21 00:11:23 crc kubenswrapper[4906]: I0221 00:11:23.084187 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5c4379f6-d7b1-4ff2-b5c8-804656b85e0b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5678f9c799-jm87l\" (UID: \"5c4379f6-d7b1-4ff2-b5c8-804656b85e0b\") " pod="openshift-authentication/oauth-openshift-5678f9c799-jm87l"
Feb 21 00:11:23 crc kubenswrapper[4906]: I0221 00:11:23.084209 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5c4379f6-d7b1-4ff2-b5c8-804656b85e0b-v4-0-config-user-template-error\") pod \"oauth-openshift-5678f9c799-jm87l\" (UID: \"5c4379f6-d7b1-4ff2-b5c8-804656b85e0b\") " pod="openshift-authentication/oauth-openshift-5678f9c799-jm87l"
Feb 21 00:11:23 crc kubenswrapper[4906]: I0221 00:11:23.084236 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9qfl\" (UniqueName: \"kubernetes.io/projected/5c4379f6-d7b1-4ff2-b5c8-804656b85e0b-kube-api-access-h9qfl\") pod \"oauth-openshift-5678f9c799-jm87l\" (UID: \"5c4379f6-d7b1-4ff2-b5c8-804656b85e0b\") " pod="openshift-authentication/oauth-openshift-5678f9c799-jm87l"
Feb 21 00:11:23 crc kubenswrapper[4906]: I0221 00:11:23.084256 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5c4379f6-d7b1-4ff2-b5c8-804656b85e0b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5678f9c799-jm87l\" (UID: \"5c4379f6-d7b1-4ff2-b5c8-804656b85e0b\") " pod="openshift-authentication/oauth-openshift-5678f9c799-jm87l"
Feb 21 00:11:23 crc kubenswrapper[4906]: I0221 00:11:23.084283 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5c4379f6-d7b1-4ff2-b5c8-804656b85e0b-audit-policies\") pod \"oauth-openshift-5678f9c799-jm87l\" (UID: \"5c4379f6-d7b1-4ff2-b5c8-804656b85e0b\") " pod="openshift-authentication/oauth-openshift-5678f9c799-jm87l"
Feb 21 00:11:23 crc 
kubenswrapper[4906]: I0221 00:11:23.084311 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5c4379f6-d7b1-4ff2-b5c8-804656b85e0b-audit-dir\") pod \"oauth-openshift-5678f9c799-jm87l\" (UID: \"5c4379f6-d7b1-4ff2-b5c8-804656b85e0b\") " pod="openshift-authentication/oauth-openshift-5678f9c799-jm87l" Feb 21 00:11:23 crc kubenswrapper[4906]: I0221 00:11:23.084332 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5c4379f6-d7b1-4ff2-b5c8-804656b85e0b-v4-0-config-user-template-login\") pod \"oauth-openshift-5678f9c799-jm87l\" (UID: \"5c4379f6-d7b1-4ff2-b5c8-804656b85e0b\") " pod="openshift-authentication/oauth-openshift-5678f9c799-jm87l" Feb 21 00:11:23 crc kubenswrapper[4906]: I0221 00:11:23.084399 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5c4379f6-d7b1-4ff2-b5c8-804656b85e0b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5678f9c799-jm87l\" (UID: \"5c4379f6-d7b1-4ff2-b5c8-804656b85e0b\") " pod="openshift-authentication/oauth-openshift-5678f9c799-jm87l" Feb 21 00:11:23 crc kubenswrapper[4906]: I0221 00:11:23.084424 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5c4379f6-d7b1-4ff2-b5c8-804656b85e0b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5678f9c799-jm87l\" (UID: \"5c4379f6-d7b1-4ff2-b5c8-804656b85e0b\") " pod="openshift-authentication/oauth-openshift-5678f9c799-jm87l" Feb 21 00:11:23 crc kubenswrapper[4906]: I0221 00:11:23.084451 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/5c4379f6-d7b1-4ff2-b5c8-804656b85e0b-v4-0-config-system-service-ca\") pod \"oauth-openshift-5678f9c799-jm87l\" (UID: \"5c4379f6-d7b1-4ff2-b5c8-804656b85e0b\") " pod="openshift-authentication/oauth-openshift-5678f9c799-jm87l" Feb 21 00:11:23 crc kubenswrapper[4906]: I0221 00:11:23.085600 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5c4379f6-d7b1-4ff2-b5c8-804656b85e0b-v4-0-config-system-service-ca\") pod \"oauth-openshift-5678f9c799-jm87l\" (UID: \"5c4379f6-d7b1-4ff2-b5c8-804656b85e0b\") " pod="openshift-authentication/oauth-openshift-5678f9c799-jm87l" Feb 21 00:11:23 crc kubenswrapper[4906]: I0221 00:11:23.085609 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5c4379f6-d7b1-4ff2-b5c8-804656b85e0b-audit-dir\") pod \"oauth-openshift-5678f9c799-jm87l\" (UID: \"5c4379f6-d7b1-4ff2-b5c8-804656b85e0b\") " pod="openshift-authentication/oauth-openshift-5678f9c799-jm87l" Feb 21 00:11:23 crc kubenswrapper[4906]: I0221 00:11:23.085816 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5c4379f6-d7b1-4ff2-b5c8-804656b85e0b-audit-policies\") pod \"oauth-openshift-5678f9c799-jm87l\" (UID: \"5c4379f6-d7b1-4ff2-b5c8-804656b85e0b\") " pod="openshift-authentication/oauth-openshift-5678f9c799-jm87l" Feb 21 00:11:23 crc kubenswrapper[4906]: I0221 00:11:23.086167 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5c4379f6-d7b1-4ff2-b5c8-804656b85e0b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5678f9c799-jm87l\" (UID: \"5c4379f6-d7b1-4ff2-b5c8-804656b85e0b\") " pod="openshift-authentication/oauth-openshift-5678f9c799-jm87l" Feb 21 00:11:23 crc kubenswrapper[4906]: I0221 00:11:23.086416 4906 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c4379f6-d7b1-4ff2-b5c8-804656b85e0b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5678f9c799-jm87l\" (UID: \"5c4379f6-d7b1-4ff2-b5c8-804656b85e0b\") " pod="openshift-authentication/oauth-openshift-5678f9c799-jm87l" Feb 21 00:11:23 crc kubenswrapper[4906]: I0221 00:11:23.089774 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5c4379f6-d7b1-4ff2-b5c8-804656b85e0b-v4-0-config-system-session\") pod \"oauth-openshift-5678f9c799-jm87l\" (UID: \"5c4379f6-d7b1-4ff2-b5c8-804656b85e0b\") " pod="openshift-authentication/oauth-openshift-5678f9c799-jm87l" Feb 21 00:11:23 crc kubenswrapper[4906]: I0221 00:11:23.091032 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5c4379f6-d7b1-4ff2-b5c8-804656b85e0b-v4-0-config-user-template-login\") pod \"oauth-openshift-5678f9c799-jm87l\" (UID: \"5c4379f6-d7b1-4ff2-b5c8-804656b85e0b\") " pod="openshift-authentication/oauth-openshift-5678f9c799-jm87l" Feb 21 00:11:23 crc kubenswrapper[4906]: I0221 00:11:23.091562 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5c4379f6-d7b1-4ff2-b5c8-804656b85e0b-v4-0-config-system-router-certs\") pod \"oauth-openshift-5678f9c799-jm87l\" (UID: \"5c4379f6-d7b1-4ff2-b5c8-804656b85e0b\") " pod="openshift-authentication/oauth-openshift-5678f9c799-jm87l" Feb 21 00:11:23 crc kubenswrapper[4906]: I0221 00:11:23.091817 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5c4379f6-d7b1-4ff2-b5c8-804656b85e0b-v4-0-config-user-template-provider-selection\") pod 
\"oauth-openshift-5678f9c799-jm87l\" (UID: \"5c4379f6-d7b1-4ff2-b5c8-804656b85e0b\") " pod="openshift-authentication/oauth-openshift-5678f9c799-jm87l" Feb 21 00:11:23 crc kubenswrapper[4906]: I0221 00:11:23.091830 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5c4379f6-d7b1-4ff2-b5c8-804656b85e0b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5678f9c799-jm87l\" (UID: \"5c4379f6-d7b1-4ff2-b5c8-804656b85e0b\") " pod="openshift-authentication/oauth-openshift-5678f9c799-jm87l" Feb 21 00:11:23 crc kubenswrapper[4906]: I0221 00:11:23.092146 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5c4379f6-d7b1-4ff2-b5c8-804656b85e0b-v4-0-config-user-template-error\") pod \"oauth-openshift-5678f9c799-jm87l\" (UID: \"5c4379f6-d7b1-4ff2-b5c8-804656b85e0b\") " pod="openshift-authentication/oauth-openshift-5678f9c799-jm87l" Feb 21 00:11:23 crc kubenswrapper[4906]: I0221 00:11:23.092401 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5c4379f6-d7b1-4ff2-b5c8-804656b85e0b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5678f9c799-jm87l\" (UID: \"5c4379f6-d7b1-4ff2-b5c8-804656b85e0b\") " pod="openshift-authentication/oauth-openshift-5678f9c799-jm87l" Feb 21 00:11:23 crc kubenswrapper[4906]: I0221 00:11:23.093833 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5c4379f6-d7b1-4ff2-b5c8-804656b85e0b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5678f9c799-jm87l\" (UID: \"5c4379f6-d7b1-4ff2-b5c8-804656b85e0b\") " pod="openshift-authentication/oauth-openshift-5678f9c799-jm87l" Feb 21 00:11:23 crc kubenswrapper[4906]: I0221 00:11:23.113239 4906 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9qfl\" (UniqueName: \"kubernetes.io/projected/5c4379f6-d7b1-4ff2-b5c8-804656b85e0b-kube-api-access-h9qfl\") pod \"oauth-openshift-5678f9c799-jm87l\" (UID: \"5c4379f6-d7b1-4ff2-b5c8-804656b85e0b\") " pod="openshift-authentication/oauth-openshift-5678f9c799-jm87l" Feb 21 00:11:23 crc kubenswrapper[4906]: I0221 00:11:23.178243 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5678f9c799-jm87l" Feb 21 00:11:23 crc kubenswrapper[4906]: I0221 00:11:23.402659 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5678f9c799-jm87l"] Feb 21 00:11:23 crc kubenswrapper[4906]: I0221 00:11:23.528260 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0f1e160-28c6-4ff5-8b24-8c962f120747" path="/var/lib/kubelet/pods/e0f1e160-28c6-4ff5-8b24-8c962f120747/volumes" Feb 21 00:11:23 crc kubenswrapper[4906]: I0221 00:11:23.554945 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5678f9c799-jm87l" event={"ID":"5c4379f6-d7b1-4ff2-b5c8-804656b85e0b","Type":"ContainerStarted","Data":"857673cec73fa5dd65d0957924fb738d28bb5cf04c30bf490af2a5113e742dd7"} Feb 21 00:11:24 crc kubenswrapper[4906]: I0221 00:11:24.308816 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ptwns" Feb 21 00:11:24 crc kubenswrapper[4906]: I0221 00:11:24.350233 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ptwns" Feb 21 00:11:24 crc kubenswrapper[4906]: I0221 00:11:24.453205 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zf44m" Feb 21 00:11:24 crc kubenswrapper[4906]: I0221 00:11:24.453258 4906 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/certified-operators-zf44m" Feb 21 00:11:24 crc kubenswrapper[4906]: I0221 00:11:24.495635 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zf44m" Feb 21 00:11:24 crc kubenswrapper[4906]: I0221 00:11:24.620600 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zf44m" Feb 21 00:11:25 crc kubenswrapper[4906]: I0221 00:11:25.568768 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5678f9c799-jm87l" event={"ID":"5c4379f6-d7b1-4ff2-b5c8-804656b85e0b","Type":"ContainerStarted","Data":"4fc226c8ea2b8343e8c3e55ca972979a90a4dc914e68c1616b50c43eb4fe677c"} Feb 21 00:11:25 crc kubenswrapper[4906]: I0221 00:11:25.569075 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5678f9c799-jm87l" Feb 21 00:11:25 crc kubenswrapper[4906]: I0221 00:11:25.576059 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5678f9c799-jm87l" Feb 21 00:11:25 crc kubenswrapper[4906]: I0221 00:11:25.591155 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5678f9c799-jm87l" podStartSLOduration=31.591132674 podStartE2EDuration="31.591132674s" podCreationTimestamp="2026-02-21 00:10:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:11:25.590510496 +0000 UTC m=+220.842098012" watchObservedRunningTime="2026-02-21 00:11:25.591132674 +0000 UTC m=+220.842720200" Feb 21 00:11:27 crc kubenswrapper[4906]: I0221 00:11:27.022159 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gpk7r" Feb 21 00:11:27 crc 
kubenswrapper[4906]: I0221 00:11:27.551656 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bpdlr" Feb 21 00:11:27 crc kubenswrapper[4906]: I0221 00:11:27.554336 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gpk7r"] Feb 21 00:11:27 crc kubenswrapper[4906]: I0221 00:11:27.586306 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gpk7r" podUID="92ef311b-1a77-4d80-be38-637a6aa7de11" containerName="registry-server" containerID="cri-o://1c6e99ab906f09dbb4e94d02e19e75ed740f650bb0efd4097ee7eced30605998" gracePeriod=2 Feb 21 00:11:27 crc kubenswrapper[4906]: I0221 00:11:27.623638 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bpdlr" Feb 21 00:11:27 crc kubenswrapper[4906]: E0221 00:11:27.730932 4906 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92ef311b_1a77_4d80_be38_637a6aa7de11.slice/crio-1c6e99ab906f09dbb4e94d02e19e75ed740f650bb0efd4097ee7eced30605998.scope\": RecentStats: unable to find data in memory cache]" Feb 21 00:11:28 crc kubenswrapper[4906]: I0221 00:11:28.546723 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gpk7r" Feb 21 00:11:28 crc kubenswrapper[4906]: I0221 00:11:28.624709 4906 generic.go:334] "Generic (PLEG): container finished" podID="92ef311b-1a77-4d80-be38-637a6aa7de11" containerID="1c6e99ab906f09dbb4e94d02e19e75ed740f650bb0efd4097ee7eced30605998" exitCode=0 Feb 21 00:11:28 crc kubenswrapper[4906]: I0221 00:11:28.624747 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gpk7r" event={"ID":"92ef311b-1a77-4d80-be38-637a6aa7de11","Type":"ContainerDied","Data":"1c6e99ab906f09dbb4e94d02e19e75ed740f650bb0efd4097ee7eced30605998"} Feb 21 00:11:28 crc kubenswrapper[4906]: I0221 00:11:28.624789 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gpk7r" event={"ID":"92ef311b-1a77-4d80-be38-637a6aa7de11","Type":"ContainerDied","Data":"f4e8c6590bb14669f8fb31fc1a7c6927be9861bed902e62e522e3f61bc03e342"} Feb 21 00:11:28 crc kubenswrapper[4906]: I0221 00:11:28.624807 4906 scope.go:117] "RemoveContainer" containerID="1c6e99ab906f09dbb4e94d02e19e75ed740f650bb0efd4097ee7eced30605998" Feb 21 00:11:28 crc kubenswrapper[4906]: I0221 00:11:28.624816 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gpk7r" Feb 21 00:11:28 crc kubenswrapper[4906]: I0221 00:11:28.640471 4906 scope.go:117] "RemoveContainer" containerID="4be5ab5bcbcdf5618c9da199620d9d10bd3bf05a9a9539523ebeedc86a0d5efd" Feb 21 00:11:28 crc kubenswrapper[4906]: I0221 00:11:28.655848 4906 scope.go:117] "RemoveContainer" containerID="28bc1c0d1e8952af5d188c26e1a40de18d4d88c1ea53407d15c2dd0c4f93dbad" Feb 21 00:11:28 crc kubenswrapper[4906]: I0221 00:11:28.672534 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92ef311b-1a77-4d80-be38-637a6aa7de11-utilities\") pod \"92ef311b-1a77-4d80-be38-637a6aa7de11\" (UID: \"92ef311b-1a77-4d80-be38-637a6aa7de11\") " Feb 21 00:11:28 crc kubenswrapper[4906]: I0221 00:11:28.672608 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92ef311b-1a77-4d80-be38-637a6aa7de11-catalog-content\") pod \"92ef311b-1a77-4d80-be38-637a6aa7de11\" (UID: \"92ef311b-1a77-4d80-be38-637a6aa7de11\") " Feb 21 00:11:28 crc kubenswrapper[4906]: I0221 00:11:28.672653 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfstm\" (UniqueName: \"kubernetes.io/projected/92ef311b-1a77-4d80-be38-637a6aa7de11-kube-api-access-vfstm\") pod \"92ef311b-1a77-4d80-be38-637a6aa7de11\" (UID: \"92ef311b-1a77-4d80-be38-637a6aa7de11\") " Feb 21 00:11:28 crc kubenswrapper[4906]: I0221 00:11:28.674734 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92ef311b-1a77-4d80-be38-637a6aa7de11-utilities" (OuterVolumeSpecName: "utilities") pod "92ef311b-1a77-4d80-be38-637a6aa7de11" (UID: "92ef311b-1a77-4d80-be38-637a6aa7de11"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:11:28 crc kubenswrapper[4906]: I0221 00:11:28.677849 4906 scope.go:117] "RemoveContainer" containerID="1c6e99ab906f09dbb4e94d02e19e75ed740f650bb0efd4097ee7eced30605998" Feb 21 00:11:28 crc kubenswrapper[4906]: E0221 00:11:28.678577 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c6e99ab906f09dbb4e94d02e19e75ed740f650bb0efd4097ee7eced30605998\": container with ID starting with 1c6e99ab906f09dbb4e94d02e19e75ed740f650bb0efd4097ee7eced30605998 not found: ID does not exist" containerID="1c6e99ab906f09dbb4e94d02e19e75ed740f650bb0efd4097ee7eced30605998" Feb 21 00:11:28 crc kubenswrapper[4906]: I0221 00:11:28.678611 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c6e99ab906f09dbb4e94d02e19e75ed740f650bb0efd4097ee7eced30605998"} err="failed to get container status \"1c6e99ab906f09dbb4e94d02e19e75ed740f650bb0efd4097ee7eced30605998\": rpc error: code = NotFound desc = could not find container \"1c6e99ab906f09dbb4e94d02e19e75ed740f650bb0efd4097ee7eced30605998\": container with ID starting with 1c6e99ab906f09dbb4e94d02e19e75ed740f650bb0efd4097ee7eced30605998 not found: ID does not exist" Feb 21 00:11:28 crc kubenswrapper[4906]: I0221 00:11:28.678633 4906 scope.go:117] "RemoveContainer" containerID="4be5ab5bcbcdf5618c9da199620d9d10bd3bf05a9a9539523ebeedc86a0d5efd" Feb 21 00:11:28 crc kubenswrapper[4906]: E0221 00:11:28.679809 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4be5ab5bcbcdf5618c9da199620d9d10bd3bf05a9a9539523ebeedc86a0d5efd\": container with ID starting with 4be5ab5bcbcdf5618c9da199620d9d10bd3bf05a9a9539523ebeedc86a0d5efd not found: ID does not exist" containerID="4be5ab5bcbcdf5618c9da199620d9d10bd3bf05a9a9539523ebeedc86a0d5efd" Feb 21 00:11:28 crc kubenswrapper[4906]: I0221 00:11:28.679834 
4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4be5ab5bcbcdf5618c9da199620d9d10bd3bf05a9a9539523ebeedc86a0d5efd"} err="failed to get container status \"4be5ab5bcbcdf5618c9da199620d9d10bd3bf05a9a9539523ebeedc86a0d5efd\": rpc error: code = NotFound desc = could not find container \"4be5ab5bcbcdf5618c9da199620d9d10bd3bf05a9a9539523ebeedc86a0d5efd\": container with ID starting with 4be5ab5bcbcdf5618c9da199620d9d10bd3bf05a9a9539523ebeedc86a0d5efd not found: ID does not exist" Feb 21 00:11:28 crc kubenswrapper[4906]: I0221 00:11:28.679849 4906 scope.go:117] "RemoveContainer" containerID="28bc1c0d1e8952af5d188c26e1a40de18d4d88c1ea53407d15c2dd0c4f93dbad" Feb 21 00:11:28 crc kubenswrapper[4906]: E0221 00:11:28.680194 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28bc1c0d1e8952af5d188c26e1a40de18d4d88c1ea53407d15c2dd0c4f93dbad\": container with ID starting with 28bc1c0d1e8952af5d188c26e1a40de18d4d88c1ea53407d15c2dd0c4f93dbad not found: ID does not exist" containerID="28bc1c0d1e8952af5d188c26e1a40de18d4d88c1ea53407d15c2dd0c4f93dbad" Feb 21 00:11:28 crc kubenswrapper[4906]: I0221 00:11:28.680214 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28bc1c0d1e8952af5d188c26e1a40de18d4d88c1ea53407d15c2dd0c4f93dbad"} err="failed to get container status \"28bc1c0d1e8952af5d188c26e1a40de18d4d88c1ea53407d15c2dd0c4f93dbad\": rpc error: code = NotFound desc = could not find container \"28bc1c0d1e8952af5d188c26e1a40de18d4d88c1ea53407d15c2dd0c4f93dbad\": container with ID starting with 28bc1c0d1e8952af5d188c26e1a40de18d4d88c1ea53407d15c2dd0c4f93dbad not found: ID does not exist" Feb 21 00:11:28 crc kubenswrapper[4906]: I0221 00:11:28.681983 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92ef311b-1a77-4d80-be38-637a6aa7de11-kube-api-access-vfstm" 
(OuterVolumeSpecName: "kube-api-access-vfstm") pod "92ef311b-1a77-4d80-be38-637a6aa7de11" (UID: "92ef311b-1a77-4d80-be38-637a6aa7de11"). InnerVolumeSpecName "kube-api-access-vfstm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:11:28 crc kubenswrapper[4906]: I0221 00:11:28.700028 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92ef311b-1a77-4d80-be38-637a6aa7de11-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "92ef311b-1a77-4d80-be38-637a6aa7de11" (UID: "92ef311b-1a77-4d80-be38-637a6aa7de11"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:11:28 crc kubenswrapper[4906]: I0221 00:11:28.773760 4906 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92ef311b-1a77-4d80-be38-637a6aa7de11-utilities\") on node \"crc\" DevicePath \"\"" Feb 21 00:11:28 crc kubenswrapper[4906]: I0221 00:11:28.773811 4906 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92ef311b-1a77-4d80-be38-637a6aa7de11-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 21 00:11:28 crc kubenswrapper[4906]: I0221 00:11:28.773828 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfstm\" (UniqueName: \"kubernetes.io/projected/92ef311b-1a77-4d80-be38-637a6aa7de11-kube-api-access-vfstm\") on node \"crc\" DevicePath \"\"" Feb 21 00:11:28 crc kubenswrapper[4906]: I0221 00:11:28.954355 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gpk7r"] Feb 21 00:11:28 crc kubenswrapper[4906]: I0221 00:11:28.960369 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gpk7r"] Feb 21 00:11:29 crc kubenswrapper[4906]: I0221 00:11:29.526771 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="92ef311b-1a77-4d80-be38-637a6aa7de11" path="/var/lib/kubelet/pods/92ef311b-1a77-4d80-be38-637a6aa7de11/volumes" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.212084 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zf44m"] Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.212653 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zf44m" podUID="0c851d93-c82d-4d73-b456-47c8fa6e3f5d" containerName="registry-server" containerID="cri-o://dd5d961bb852f6b1e00335661d1386af3f7e62cfb894648e87ac4bd918ba2c47" gracePeriod=30 Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.232674 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ptwns"] Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.233010 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ptwns" podUID="2ac4beba-19e0-4278-9701-d830f33d6688" containerName="registry-server" containerID="cri-o://541b36279f5c49d0d7dddf3cfa15d35e92e940ae29bf529ca338250fa6074da8" gracePeriod=30 Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.236713 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fm5j7"] Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.237019 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-fm5j7" podUID="16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e" containerName="marketplace-operator" containerID="cri-o://ee3d8b4f03b7ffb13a7aec2aaab5cbcc97bfa18e06cbac3bab17d0f29f7eafc0" gracePeriod=30 Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.275258 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5mlvx"] Feb 21 00:11:42 crc kubenswrapper[4906]: 
I0221 00:11:42.276116 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5mlvx" podUID="06df1df6-9fbb-40ed-a746-f90f10d5286f" containerName="registry-server" containerID="cri-o://373b2f2b11d9dd1ff16703493e1fea0ab03355eb78f2b61b6697c4a3bc4eee9e" gracePeriod=30 Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.294982 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bpdlr"] Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.295410 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bpdlr" podUID="e2cdeb59-eeb8-4641-921e-05a840fd96fa" containerName="registry-server" containerID="cri-o://c9ca58e8f0041907d9d30a883af56b57fe0f3e7a2f42f1534e698c32d0526b13" gracePeriod=30 Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.307792 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-h4zrd"] Feb 21 00:11:42 crc kubenswrapper[4906]: E0221 00:11:42.308090 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92ef311b-1a77-4d80-be38-637a6aa7de11" containerName="registry-server" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.308107 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="92ef311b-1a77-4d80-be38-637a6aa7de11" containerName="registry-server" Feb 21 00:11:42 crc kubenswrapper[4906]: E0221 00:11:42.308145 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92ef311b-1a77-4d80-be38-637a6aa7de11" containerName="extract-utilities" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.308155 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="92ef311b-1a77-4d80-be38-637a6aa7de11" containerName="extract-utilities" Feb 21 00:11:42 crc kubenswrapper[4906]: E0221 00:11:42.308171 4906 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="92ef311b-1a77-4d80-be38-637a6aa7de11" containerName="extract-content" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.308179 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="92ef311b-1a77-4d80-be38-637a6aa7de11" containerName="extract-content" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.308315 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="92ef311b-1a77-4d80-be38-637a6aa7de11" containerName="registry-server" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.308744 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-h4zrd" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.311753 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-h4zrd"] Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.461399 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/951dfb31-b021-468b-bbae-cd6077c0cfdd-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-h4zrd\" (UID: \"951dfb31-b021-468b-bbae-cd6077c0cfdd\") " pod="openshift-marketplace/marketplace-operator-79b997595-h4zrd" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.461465 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwx7z\" (UniqueName: \"kubernetes.io/projected/951dfb31-b021-468b-bbae-cd6077c0cfdd-kube-api-access-wwx7z\") pod \"marketplace-operator-79b997595-h4zrd\" (UID: \"951dfb31-b021-468b-bbae-cd6077c0cfdd\") " pod="openshift-marketplace/marketplace-operator-79b997595-h4zrd" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.461495 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/951dfb31-b021-468b-bbae-cd6077c0cfdd-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-h4zrd\" (UID: \"951dfb31-b021-468b-bbae-cd6077c0cfdd\") " pod="openshift-marketplace/marketplace-operator-79b997595-h4zrd" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.562183 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/951dfb31-b021-468b-bbae-cd6077c0cfdd-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-h4zrd\" (UID: \"951dfb31-b021-468b-bbae-cd6077c0cfdd\") " pod="openshift-marketplace/marketplace-operator-79b997595-h4zrd" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.562250 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwx7z\" (UniqueName: \"kubernetes.io/projected/951dfb31-b021-468b-bbae-cd6077c0cfdd-kube-api-access-wwx7z\") pod \"marketplace-operator-79b997595-h4zrd\" (UID: \"951dfb31-b021-468b-bbae-cd6077c0cfdd\") " pod="openshift-marketplace/marketplace-operator-79b997595-h4zrd" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.562274 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/951dfb31-b021-468b-bbae-cd6077c0cfdd-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-h4zrd\" (UID: \"951dfb31-b021-468b-bbae-cd6077c0cfdd\") " pod="openshift-marketplace/marketplace-operator-79b997595-h4zrd" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.564543 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/951dfb31-b021-468b-bbae-cd6077c0cfdd-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-h4zrd\" (UID: \"951dfb31-b021-468b-bbae-cd6077c0cfdd\") " pod="openshift-marketplace/marketplace-operator-79b997595-h4zrd" Feb 21 00:11:42 crc 
kubenswrapper[4906]: I0221 00:11:42.568791 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/951dfb31-b021-468b-bbae-cd6077c0cfdd-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-h4zrd\" (UID: \"951dfb31-b021-468b-bbae-cd6077c0cfdd\") " pod="openshift-marketplace/marketplace-operator-79b997595-h4zrd" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.583622 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwx7z\" (UniqueName: \"kubernetes.io/projected/951dfb31-b021-468b-bbae-cd6077c0cfdd-kube-api-access-wwx7z\") pod \"marketplace-operator-79b997595-h4zrd\" (UID: \"951dfb31-b021-468b-bbae-cd6077c0cfdd\") " pod="openshift-marketplace/marketplace-operator-79b997595-h4zrd" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.662738 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-h4zrd" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.668082 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zf44m" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.672251 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ptwns" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.675734 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-fm5j7" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.728190 4906 generic.go:334] "Generic (PLEG): container finished" podID="16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e" containerID="ee3d8b4f03b7ffb13a7aec2aaab5cbcc97bfa18e06cbac3bab17d0f29f7eafc0" exitCode=0 Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.728250 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-fm5j7" event={"ID":"16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e","Type":"ContainerDied","Data":"ee3d8b4f03b7ffb13a7aec2aaab5cbcc97bfa18e06cbac3bab17d0f29f7eafc0"} Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.728282 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-fm5j7" event={"ID":"16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e","Type":"ContainerDied","Data":"7eb5ae6babfcab802c3e1c8f31b07512f65c6eaa1d3d67ec71baf3e3d6e451e8"} Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.728302 4906 scope.go:117] "RemoveContainer" containerID="ee3d8b4f03b7ffb13a7aec2aaab5cbcc97bfa18e06cbac3bab17d0f29f7eafc0" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.728521 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-fm5j7" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.735476 4906 generic.go:334] "Generic (PLEG): container finished" podID="0c851d93-c82d-4d73-b456-47c8fa6e3f5d" containerID="dd5d961bb852f6b1e00335661d1386af3f7e62cfb894648e87ac4bd918ba2c47" exitCode=0 Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.735545 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zf44m" event={"ID":"0c851d93-c82d-4d73-b456-47c8fa6e3f5d","Type":"ContainerDied","Data":"dd5d961bb852f6b1e00335661d1386af3f7e62cfb894648e87ac4bd918ba2c47"} Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.735576 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zf44m" event={"ID":"0c851d93-c82d-4d73-b456-47c8fa6e3f5d","Type":"ContainerDied","Data":"4c58adc2e5b77a6204057f65839f7f4a9ad5a0bbac1380a769728daf824e331d"} Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.735651 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zf44m" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.738526 4906 generic.go:334] "Generic (PLEG): container finished" podID="06df1df6-9fbb-40ed-a746-f90f10d5286f" containerID="373b2f2b11d9dd1ff16703493e1fea0ab03355eb78f2b61b6697c4a3bc4eee9e" exitCode=0 Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.738563 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5mlvx" event={"ID":"06df1df6-9fbb-40ed-a746-f90f10d5286f","Type":"ContainerDied","Data":"373b2f2b11d9dd1ff16703493e1fea0ab03355eb78f2b61b6697c4a3bc4eee9e"} Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.741067 4906 generic.go:334] "Generic (PLEG): container finished" podID="2ac4beba-19e0-4278-9701-d830f33d6688" containerID="541b36279f5c49d0d7dddf3cfa15d35e92e940ae29bf529ca338250fa6074da8" exitCode=0 Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.741204 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ptwns" event={"ID":"2ac4beba-19e0-4278-9701-d830f33d6688","Type":"ContainerDied","Data":"541b36279f5c49d0d7dddf3cfa15d35e92e940ae29bf529ca338250fa6074da8"} Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.741313 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ptwns" event={"ID":"2ac4beba-19e0-4278-9701-d830f33d6688","Type":"ContainerDied","Data":"2a75c1f1ecf8eba4e78df89cb0639ffbdf3a5066a258dc84693ac107fa6aaa8f"} Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.741406 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ptwns" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.748628 4906 generic.go:334] "Generic (PLEG): container finished" podID="e2cdeb59-eeb8-4641-921e-05a840fd96fa" containerID="c9ca58e8f0041907d9d30a883af56b57fe0f3e7a2f42f1534e698c32d0526b13" exitCode=0 Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.748677 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bpdlr" event={"ID":"e2cdeb59-eeb8-4641-921e-05a840fd96fa","Type":"ContainerDied","Data":"c9ca58e8f0041907d9d30a883af56b57fe0f3e7a2f42f1534e698c32d0526b13"} Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.755358 4906 scope.go:117] "RemoveContainer" containerID="ee3d8b4f03b7ffb13a7aec2aaab5cbcc97bfa18e06cbac3bab17d0f29f7eafc0" Feb 21 00:11:42 crc kubenswrapper[4906]: E0221 00:11:42.755988 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee3d8b4f03b7ffb13a7aec2aaab5cbcc97bfa18e06cbac3bab17d0f29f7eafc0\": container with ID starting with ee3d8b4f03b7ffb13a7aec2aaab5cbcc97bfa18e06cbac3bab17d0f29f7eafc0 not found: ID does not exist" containerID="ee3d8b4f03b7ffb13a7aec2aaab5cbcc97bfa18e06cbac3bab17d0f29f7eafc0" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.756303 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee3d8b4f03b7ffb13a7aec2aaab5cbcc97bfa18e06cbac3bab17d0f29f7eafc0"} err="failed to get container status \"ee3d8b4f03b7ffb13a7aec2aaab5cbcc97bfa18e06cbac3bab17d0f29f7eafc0\": rpc error: code = NotFound desc = could not find container \"ee3d8b4f03b7ffb13a7aec2aaab5cbcc97bfa18e06cbac3bab17d0f29f7eafc0\": container with ID starting with ee3d8b4f03b7ffb13a7aec2aaab5cbcc97bfa18e06cbac3bab17d0f29f7eafc0 not found: ID does not exist" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.756335 4906 scope.go:117] "RemoveContainer" 
containerID="dd5d961bb852f6b1e00335661d1386af3f7e62cfb894648e87ac4bd918ba2c47" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.758907 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bpdlr" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.765367 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e-marketplace-trusted-ca\") pod \"16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e\" (UID: \"16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e\") " Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.765420 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9flm\" (UniqueName: \"kubernetes.io/projected/2ac4beba-19e0-4278-9701-d830f33d6688-kube-api-access-q9flm\") pod \"2ac4beba-19e0-4278-9701-d830f33d6688\" (UID: \"2ac4beba-19e0-4278-9701-d830f33d6688\") " Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.765524 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcnsh\" (UniqueName: \"kubernetes.io/projected/0c851d93-c82d-4d73-b456-47c8fa6e3f5d-kube-api-access-bcnsh\") pod \"0c851d93-c82d-4d73-b456-47c8fa6e3f5d\" (UID: \"0c851d93-c82d-4d73-b456-47c8fa6e3f5d\") " Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.766326 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e" (UID: "16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.765584 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ac4beba-19e0-4278-9701-d830f33d6688-utilities\") pod \"2ac4beba-19e0-4278-9701-d830f33d6688\" (UID: \"2ac4beba-19e0-4278-9701-d830f33d6688\") " Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.766708 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c851d93-c82d-4d73-b456-47c8fa6e3f5d-utilities\") pod \"0c851d93-c82d-4d73-b456-47c8fa6e3f5d\" (UID: \"0c851d93-c82d-4d73-b456-47c8fa6e3f5d\") " Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.766853 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpc5q\" (UniqueName: \"kubernetes.io/projected/16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e-kube-api-access-wpc5q\") pod \"16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e\" (UID: \"16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e\") " Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.766900 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ac4beba-19e0-4278-9701-d830f33d6688-catalog-content\") pod \"2ac4beba-19e0-4278-9701-d830f33d6688\" (UID: \"2ac4beba-19e0-4278-9701-d830f33d6688\") " Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.766929 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e-marketplace-operator-metrics\") pod \"16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e\" (UID: \"16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e\") " Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.766968 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c851d93-c82d-4d73-b456-47c8fa6e3f5d-catalog-content\") pod \"0c851d93-c82d-4d73-b456-47c8fa6e3f5d\" (UID: \"0c851d93-c82d-4d73-b456-47c8fa6e3f5d\") " Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.767240 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ac4beba-19e0-4278-9701-d830f33d6688-utilities" (OuterVolumeSpecName: "utilities") pod "2ac4beba-19e0-4278-9701-d830f33d6688" (UID: "2ac4beba-19e0-4278-9701-d830f33d6688"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.768025 4906 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ac4beba-19e0-4278-9701-d830f33d6688-utilities\") on node \"crc\" DevicePath \"\"" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.768318 4906 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.768945 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c851d93-c82d-4d73-b456-47c8fa6e3f5d-utilities" (OuterVolumeSpecName: "utilities") pod "0c851d93-c82d-4d73-b456-47c8fa6e3f5d" (UID: "0c851d93-c82d-4d73-b456-47c8fa6e3f5d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.770157 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ac4beba-19e0-4278-9701-d830f33d6688-kube-api-access-q9flm" (OuterVolumeSpecName: "kube-api-access-q9flm") pod "2ac4beba-19e0-4278-9701-d830f33d6688" (UID: "2ac4beba-19e0-4278-9701-d830f33d6688"). 
InnerVolumeSpecName "kube-api-access-q9flm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.770212 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5mlvx" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.770613 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c851d93-c82d-4d73-b456-47c8fa6e3f5d-kube-api-access-bcnsh" (OuterVolumeSpecName: "kube-api-access-bcnsh") pod "0c851d93-c82d-4d73-b456-47c8fa6e3f5d" (UID: "0c851d93-c82d-4d73-b456-47c8fa6e3f5d"). InnerVolumeSpecName "kube-api-access-bcnsh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.771665 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e" (UID: "16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.785001 4906 scope.go:117] "RemoveContainer" containerID="55da5e8887635f4de079f3f8ce939427332f162722f9ca10b74dddba93ec093f" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.790663 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e-kube-api-access-wpc5q" (OuterVolumeSpecName: "kube-api-access-wpc5q") pod "16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e" (UID: "16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e"). InnerVolumeSpecName "kube-api-access-wpc5q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.810811 4906 scope.go:117] "RemoveContainer" containerID="bbbfe5e1c99edf158d5b26a4e5af605ad0ebba3afc6be58bdfd011bb47f32f71" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.813073 4906 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.813390 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8" gracePeriod=15 Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.813490 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58" gracePeriod=15 Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.813530 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://8d527700428fdbde47902ae05e7592fb0f3b43af16fa72386eb4a1ff274156a5" gracePeriod=15 Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.813558 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c" gracePeriod=15 Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.813541 4906 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa" gracePeriod=15 Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.821879 4906 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 21 00:11:42 crc kubenswrapper[4906]: E0221 00:11:42.824272 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c851d93-c82d-4d73-b456-47c8fa6e3f5d" containerName="registry-server" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.824382 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c851d93-c82d-4d73-b456-47c8fa6e3f5d" containerName="registry-server" Feb 21 00:11:42 crc kubenswrapper[4906]: E0221 00:11:42.824395 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.824401 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 21 00:11:42 crc kubenswrapper[4906]: E0221 00:11:42.824408 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c851d93-c82d-4d73-b456-47c8fa6e3f5d" containerName="extract-content" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.824414 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c851d93-c82d-4d73-b456-47c8fa6e3f5d" containerName="extract-content" Feb 21 00:11:42 crc kubenswrapper[4906]: E0221 00:11:42.824422 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ac4beba-19e0-4278-9701-d830f33d6688" containerName="extract-utilities" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.824428 4906 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2ac4beba-19e0-4278-9701-d830f33d6688" containerName="extract-utilities" Feb 21 00:11:42 crc kubenswrapper[4906]: E0221 00:11:42.824445 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06df1df6-9fbb-40ed-a746-f90f10d5286f" containerName="extract-content" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.824451 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="06df1df6-9fbb-40ed-a746-f90f10d5286f" containerName="extract-content" Feb 21 00:11:42 crc kubenswrapper[4906]: E0221 00:11:42.824461 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06df1df6-9fbb-40ed-a746-f90f10d5286f" containerName="extract-utilities" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.824469 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="06df1df6-9fbb-40ed-a746-f90f10d5286f" containerName="extract-utilities" Feb 21 00:11:42 crc kubenswrapper[4906]: E0221 00:11:42.824479 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2cdeb59-eeb8-4641-921e-05a840fd96fa" containerName="extract-utilities" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.824486 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2cdeb59-eeb8-4641-921e-05a840fd96fa" containerName="extract-utilities" Feb 21 00:11:42 crc kubenswrapper[4906]: E0221 00:11:42.824493 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2cdeb59-eeb8-4641-921e-05a840fd96fa" containerName="extract-content" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.824499 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2cdeb59-eeb8-4641-921e-05a840fd96fa" containerName="extract-content" Feb 21 00:11:42 crc kubenswrapper[4906]: E0221 00:11:42.824507 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ac4beba-19e0-4278-9701-d830f33d6688" containerName="registry-server" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.824513 4906 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2ac4beba-19e0-4278-9701-d830f33d6688" containerName="registry-server" Feb 21 00:11:42 crc kubenswrapper[4906]: E0221 00:11:42.824522 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.824529 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 21 00:11:42 crc kubenswrapper[4906]: E0221 00:11:42.824536 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.824542 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 21 00:11:42 crc kubenswrapper[4906]: E0221 00:11:42.824548 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.824554 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 21 00:11:42 crc kubenswrapper[4906]: E0221 00:11:42.824565 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e" containerName="marketplace-operator" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.824571 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e" containerName="marketplace-operator" Feb 21 00:11:42 crc kubenswrapper[4906]: E0221 00:11:42.824583 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ac4beba-19e0-4278-9701-d830f33d6688" containerName="extract-content" Feb 21 00:11:42 crc 
kubenswrapper[4906]: I0221 00:11:42.824589 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ac4beba-19e0-4278-9701-d830f33d6688" containerName="extract-content" Feb 21 00:11:42 crc kubenswrapper[4906]: E0221 00:11:42.824599 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c851d93-c82d-4d73-b456-47c8fa6e3f5d" containerName="extract-utilities" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.824604 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c851d93-c82d-4d73-b456-47c8fa6e3f5d" containerName="extract-utilities" Feb 21 00:11:42 crc kubenswrapper[4906]: E0221 00:11:42.824611 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.824616 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 21 00:11:42 crc kubenswrapper[4906]: E0221 00:11:42.824624 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.824630 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 21 00:11:42 crc kubenswrapper[4906]: E0221 00:11:42.824638 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2cdeb59-eeb8-4641-921e-05a840fd96fa" containerName="registry-server" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.824644 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2cdeb59-eeb8-4641-921e-05a840fd96fa" containerName="registry-server" Feb 21 00:11:42 crc kubenswrapper[4906]: E0221 00:11:42.824650 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 21 00:11:42 crc kubenswrapper[4906]: 
I0221 00:11:42.824656 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 21 00:11:42 crc kubenswrapper[4906]: E0221 00:11:42.824664 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06df1df6-9fbb-40ed-a746-f90f10d5286f" containerName="registry-server" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.824670 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="06df1df6-9fbb-40ed-a746-f90f10d5286f" containerName="registry-server" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.824771 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c851d93-c82d-4d73-b456-47c8fa6e3f5d" containerName="registry-server" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.824781 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.824789 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2cdeb59-eeb8-4641-921e-05a840fd96fa" containerName="registry-server" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.824797 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="06df1df6-9fbb-40ed-a746-f90f10d5286f" containerName="registry-server" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.824803 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.824814 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e" containerName="marketplace-operator" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.824821 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 21 
00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.824828 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.824834 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ac4beba-19e0-4278-9701-d830f33d6688" containerName="registry-server" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.824841 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.824848 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.833707 4906 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.836200 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.841072 4906 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.858921 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c851d93-c82d-4d73-b456-47c8fa6e3f5d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0c851d93-c82d-4d73-b456-47c8fa6e3f5d" (UID: "0c851d93-c82d-4d73-b456-47c8fa6e3f5d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.859647 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ac4beba-19e0-4278-9701-d830f33d6688-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2ac4beba-19e0-4278-9701-d830f33d6688" (UID: "2ac4beba-19e0-4278-9701-d830f33d6688"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.882345 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06df1df6-9fbb-40ed-a746-f90f10d5286f-utilities\") pod \"06df1df6-9fbb-40ed-a746-f90f10d5286f\" (UID: \"06df1df6-9fbb-40ed-a746-f90f10d5286f\") " Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.882427 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpsnw\" (UniqueName: \"kubernetes.io/projected/06df1df6-9fbb-40ed-a746-f90f10d5286f-kube-api-access-bpsnw\") pod \"06df1df6-9fbb-40ed-a746-f90f10d5286f\" (UID: \"06df1df6-9fbb-40ed-a746-f90f10d5286f\") " Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.882452 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2cdeb59-eeb8-4641-921e-05a840fd96fa-catalog-content\") pod \"e2cdeb59-eeb8-4641-921e-05a840fd96fa\" (UID: \"e2cdeb59-eeb8-4641-921e-05a840fd96fa\") " Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.882503 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fb2cm\" (UniqueName: \"kubernetes.io/projected/e2cdeb59-eeb8-4641-921e-05a840fd96fa-kube-api-access-fb2cm\") pod \"e2cdeb59-eeb8-4641-921e-05a840fd96fa\" (UID: \"e2cdeb59-eeb8-4641-921e-05a840fd96fa\") " Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.882560 
4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06df1df6-9fbb-40ed-a746-f90f10d5286f-catalog-content\") pod \"06df1df6-9fbb-40ed-a746-f90f10d5286f\" (UID: \"06df1df6-9fbb-40ed-a746-f90f10d5286f\") " Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.882583 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2cdeb59-eeb8-4641-921e-05a840fd96fa-utilities\") pod \"e2cdeb59-eeb8-4641-921e-05a840fd96fa\" (UID: \"e2cdeb59-eeb8-4641-921e-05a840fd96fa\") " Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.882911 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9flm\" (UniqueName: \"kubernetes.io/projected/2ac4beba-19e0-4278-9701-d830f33d6688-kube-api-access-q9flm\") on node \"crc\" DevicePath \"\"" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.882933 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcnsh\" (UniqueName: \"kubernetes.io/projected/0c851d93-c82d-4d73-b456-47c8fa6e3f5d-kube-api-access-bcnsh\") on node \"crc\" DevicePath \"\"" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.882946 4906 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c851d93-c82d-4d73-b456-47c8fa6e3f5d-utilities\") on node \"crc\" DevicePath \"\"" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.882958 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpc5q\" (UniqueName: \"kubernetes.io/projected/16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e-kube-api-access-wpc5q\") on node \"crc\" DevicePath \"\"" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.882970 4906 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ac4beba-19e0-4278-9701-d830f33d6688-catalog-content\") on node \"crc\" 
DevicePath \"\"" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.882983 4906 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.882997 4906 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c851d93-c82d-4d73-b456-47c8fa6e3f5d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.883437 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06df1df6-9fbb-40ed-a746-f90f10d5286f-utilities" (OuterVolumeSpecName: "utilities") pod "06df1df6-9fbb-40ed-a746-f90f10d5286f" (UID: "06df1df6-9fbb-40ed-a746-f90f10d5286f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.886804 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2cdeb59-eeb8-4641-921e-05a840fd96fa-utilities" (OuterVolumeSpecName: "utilities") pod "e2cdeb59-eeb8-4641-921e-05a840fd96fa" (UID: "e2cdeb59-eeb8-4641-921e-05a840fd96fa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.899986 4906 scope.go:117] "RemoveContainer" containerID="dd5d961bb852f6b1e00335661d1386af3f7e62cfb894648e87ac4bd918ba2c47" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.900292 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06df1df6-9fbb-40ed-a746-f90f10d5286f-kube-api-access-bpsnw" (OuterVolumeSpecName: "kube-api-access-bpsnw") pod "06df1df6-9fbb-40ed-a746-f90f10d5286f" (UID: "06df1df6-9fbb-40ed-a746-f90f10d5286f"). 
InnerVolumeSpecName "kube-api-access-bpsnw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.901613 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2cdeb59-eeb8-4641-921e-05a840fd96fa-kube-api-access-fb2cm" (OuterVolumeSpecName: "kube-api-access-fb2cm") pod "e2cdeb59-eeb8-4641-921e-05a840fd96fa" (UID: "e2cdeb59-eeb8-4641-921e-05a840fd96fa"). InnerVolumeSpecName "kube-api-access-fb2cm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:11:42 crc kubenswrapper[4906]: E0221 00:11:42.901932 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd5d961bb852f6b1e00335661d1386af3f7e62cfb894648e87ac4bd918ba2c47\": container with ID starting with dd5d961bb852f6b1e00335661d1386af3f7e62cfb894648e87ac4bd918ba2c47 not found: ID does not exist" containerID="dd5d961bb852f6b1e00335661d1386af3f7e62cfb894648e87ac4bd918ba2c47" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.902009 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd5d961bb852f6b1e00335661d1386af3f7e62cfb894648e87ac4bd918ba2c47"} err="failed to get container status \"dd5d961bb852f6b1e00335661d1386af3f7e62cfb894648e87ac4bd918ba2c47\": rpc error: code = NotFound desc = could not find container \"dd5d961bb852f6b1e00335661d1386af3f7e62cfb894648e87ac4bd918ba2c47\": container with ID starting with dd5d961bb852f6b1e00335661d1386af3f7e62cfb894648e87ac4bd918ba2c47 not found: ID does not exist" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.902061 4906 scope.go:117] "RemoveContainer" containerID="55da5e8887635f4de079f3f8ce939427332f162722f9ca10b74dddba93ec093f" Feb 21 00:11:42 crc kubenswrapper[4906]: E0221 00:11:42.902837 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"55da5e8887635f4de079f3f8ce939427332f162722f9ca10b74dddba93ec093f\": container with ID starting with 55da5e8887635f4de079f3f8ce939427332f162722f9ca10b74dddba93ec093f not found: ID does not exist" containerID="55da5e8887635f4de079f3f8ce939427332f162722f9ca10b74dddba93ec093f" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.903206 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55da5e8887635f4de079f3f8ce939427332f162722f9ca10b74dddba93ec093f"} err="failed to get container status \"55da5e8887635f4de079f3f8ce939427332f162722f9ca10b74dddba93ec093f\": rpc error: code = NotFound desc = could not find container \"55da5e8887635f4de079f3f8ce939427332f162722f9ca10b74dddba93ec093f\": container with ID starting with 55da5e8887635f4de079f3f8ce939427332f162722f9ca10b74dddba93ec093f not found: ID does not exist" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.903397 4906 scope.go:117] "RemoveContainer" containerID="bbbfe5e1c99edf158d5b26a4e5af605ad0ebba3afc6be58bdfd011bb47f32f71" Feb 21 00:11:42 crc kubenswrapper[4906]: E0221 00:11:42.904350 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbbfe5e1c99edf158d5b26a4e5af605ad0ebba3afc6be58bdfd011bb47f32f71\": container with ID starting with bbbfe5e1c99edf158d5b26a4e5af605ad0ebba3afc6be58bdfd011bb47f32f71 not found: ID does not exist" containerID="bbbfe5e1c99edf158d5b26a4e5af605ad0ebba3afc6be58bdfd011bb47f32f71" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.904372 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbbfe5e1c99edf158d5b26a4e5af605ad0ebba3afc6be58bdfd011bb47f32f71"} err="failed to get container status \"bbbfe5e1c99edf158d5b26a4e5af605ad0ebba3afc6be58bdfd011bb47f32f71\": rpc error: code = NotFound desc = could not find container \"bbbfe5e1c99edf158d5b26a4e5af605ad0ebba3afc6be58bdfd011bb47f32f71\": container with ID 
starting with bbbfe5e1c99edf158d5b26a4e5af605ad0ebba3afc6be58bdfd011bb47f32f71 not found: ID does not exist" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.904388 4906 scope.go:117] "RemoveContainer" containerID="541b36279f5c49d0d7dddf3cfa15d35e92e940ae29bf529ca338250fa6074da8" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.911712 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06df1df6-9fbb-40ed-a746-f90f10d5286f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "06df1df6-9fbb-40ed-a746-f90f10d5286f" (UID: "06df1df6-9fbb-40ed-a746-f90f10d5286f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.943957 4906 scope.go:117] "RemoveContainer" containerID="eb693f02ade8c902a0ede7390f4f45471ee1a6fb53c814fcc67288d51b880547" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.988790 4906 scope.go:117] "RemoveContainer" containerID="e6176edb28add9bd21bed334cd7e3c98e406d1ab2ab483ff476938b64e1ae8cb" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.988942 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.989068 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.989127 4906 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.989179 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.989202 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.989221 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.989281 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 
00:11:42.989303 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.989388 4906 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06df1df6-9fbb-40ed-a746-f90f10d5286f-utilities\") on node \"crc\" DevicePath \"\"" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.989423 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpsnw\" (UniqueName: \"kubernetes.io/projected/06df1df6-9fbb-40ed-a746-f90f10d5286f-kube-api-access-bpsnw\") on node \"crc\" DevicePath \"\"" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.989440 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fb2cm\" (UniqueName: \"kubernetes.io/projected/e2cdeb59-eeb8-4641-921e-05a840fd96fa-kube-api-access-fb2cm\") on node \"crc\" DevicePath \"\"" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.989450 4906 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06df1df6-9fbb-40ed-a746-f90f10d5286f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 21 00:11:42 crc kubenswrapper[4906]: I0221 00:11:42.989459 4906 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2cdeb59-eeb8-4641-921e-05a840fd96fa-utilities\") on node \"crc\" DevicePath \"\"" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.002081 4906 scope.go:117] "RemoveContainer" containerID="541b36279f5c49d0d7dddf3cfa15d35e92e940ae29bf529ca338250fa6074da8" Feb 21 00:11:43 crc kubenswrapper[4906]: E0221 00:11:43.005074 4906 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"541b36279f5c49d0d7dddf3cfa15d35e92e940ae29bf529ca338250fa6074da8\": container with ID starting with 541b36279f5c49d0d7dddf3cfa15d35e92e940ae29bf529ca338250fa6074da8 not found: ID does not exist" containerID="541b36279f5c49d0d7dddf3cfa15d35e92e940ae29bf529ca338250fa6074da8" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.005131 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"541b36279f5c49d0d7dddf3cfa15d35e92e940ae29bf529ca338250fa6074da8"} err="failed to get container status \"541b36279f5c49d0d7dddf3cfa15d35e92e940ae29bf529ca338250fa6074da8\": rpc error: code = NotFound desc = could not find container \"541b36279f5c49d0d7dddf3cfa15d35e92e940ae29bf529ca338250fa6074da8\": container with ID starting with 541b36279f5c49d0d7dddf3cfa15d35e92e940ae29bf529ca338250fa6074da8 not found: ID does not exist" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.005160 4906 scope.go:117] "RemoveContainer" containerID="eb693f02ade8c902a0ede7390f4f45471ee1a6fb53c814fcc67288d51b880547" Feb 21 00:11:43 crc kubenswrapper[4906]: E0221 00:11:43.005504 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb693f02ade8c902a0ede7390f4f45471ee1a6fb53c814fcc67288d51b880547\": container with ID starting with eb693f02ade8c902a0ede7390f4f45471ee1a6fb53c814fcc67288d51b880547 not found: ID does not exist" containerID="eb693f02ade8c902a0ede7390f4f45471ee1a6fb53c814fcc67288d51b880547" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.005523 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb693f02ade8c902a0ede7390f4f45471ee1a6fb53c814fcc67288d51b880547"} err="failed to get container status \"eb693f02ade8c902a0ede7390f4f45471ee1a6fb53c814fcc67288d51b880547\": rpc error: code = NotFound desc = could not find container 
\"eb693f02ade8c902a0ede7390f4f45471ee1a6fb53c814fcc67288d51b880547\": container with ID starting with eb693f02ade8c902a0ede7390f4f45471ee1a6fb53c814fcc67288d51b880547 not found: ID does not exist" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.005537 4906 scope.go:117] "RemoveContainer" containerID="e6176edb28add9bd21bed334cd7e3c98e406d1ab2ab483ff476938b64e1ae8cb" Feb 21 00:11:43 crc kubenswrapper[4906]: E0221 00:11:43.005949 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6176edb28add9bd21bed334cd7e3c98e406d1ab2ab483ff476938b64e1ae8cb\": container with ID starting with e6176edb28add9bd21bed334cd7e3c98e406d1ab2ab483ff476938b64e1ae8cb not found: ID does not exist" containerID="e6176edb28add9bd21bed334cd7e3c98e406d1ab2ab483ff476938b64e1ae8cb" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.005982 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6176edb28add9bd21bed334cd7e3c98e406d1ab2ab483ff476938b64e1ae8cb"} err="failed to get container status \"e6176edb28add9bd21bed334cd7e3c98e406d1ab2ab483ff476938b64e1ae8cb\": rpc error: code = NotFound desc = could not find container \"e6176edb28add9bd21bed334cd7e3c98e406d1ab2ab483ff476938b64e1ae8cb\": container with ID starting with e6176edb28add9bd21bed334cd7e3c98e406d1ab2ab483ff476938b64e1ae8cb not found: ID does not exist" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.035387 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2cdeb59-eeb8-4641-921e-05a840fd96fa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e2cdeb59-eeb8-4641-921e-05a840fd96fa" (UID: "e2cdeb59-eeb8-4641-921e-05a840fd96fa"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.042495 4906 status_manager.go:851] "Failed to get status for pod" podUID="16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e" pod="openshift-marketplace/marketplace-operator-79b997595-fm5j7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-fm5j7\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.053487 4906 status_manager.go:851] "Failed to get status for pod" podUID="16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e" pod="openshift-marketplace/marketplace-operator-79b997595-fm5j7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-fm5j7\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.053863 4906 status_manager.go:851] "Failed to get status for pod" podUID="0c851d93-c82d-4d73-b456-47c8fa6e3f5d" pod="openshift-marketplace/certified-operators-zf44m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-zf44m\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.064853 4906 status_manager.go:851] "Failed to get status for pod" podUID="16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e" pod="openshift-marketplace/marketplace-operator-79b997595-fm5j7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-fm5j7\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.065165 4906 status_manager.go:851] "Failed to get status for pod" podUID="2ac4beba-19e0-4278-9701-d830f33d6688" pod="openshift-marketplace/community-operators-ptwns" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ptwns\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.065519 4906 status_manager.go:851] "Failed to get status for pod" podUID="0c851d93-c82d-4d73-b456-47c8fa6e3f5d" pod="openshift-marketplace/certified-operators-zf44m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-zf44m\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.090660 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.090755 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.090764 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.090801 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.090829 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.090856 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.090900 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.090960 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.090988 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.090987 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.091017 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.091041 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.091055 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.091096 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.091147 4906 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2cdeb59-eeb8-4641-921e-05a840fd96fa-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.091147 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.091166 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 21 00:11:43 crc kubenswrapper[4906]: E0221 00:11:43.385659 4906 log.go:32] "RunPodSandbox from runtime service failed" err=< Feb 21 00:11:43 crc kubenswrapper[4906]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_marketplace-operator-79b997595-h4zrd_openshift-marketplace_951dfb31-b021-468b-bbae-cd6077c0cfdd_0(9f55d54bb1c97817992c26c2136c1aba508ffea0272de069120868c82a78ecc9): error adding pod openshift-marketplace_marketplace-operator-79b997595-h4zrd to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"9f55d54bb1c97817992c26c2136c1aba508ffea0272de069120868c82a78ecc9" Netns:"/var/run/netns/bc20bec0-4168-4cd0-ba32-9ff35c88c94c" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=marketplace-operator-79b997595-h4zrd;K8S_POD_INFRA_CONTAINER_ID=9f55d54bb1c97817992c26c2136c1aba508ffea0272de069120868c82a78ecc9;K8S_POD_UID=951dfb31-b021-468b-bbae-cd6077c0cfdd" Path:"" ERRORED: error configuring pod [openshift-marketplace/marketplace-operator-79b997595-h4zrd] networking: Multus: [openshift-marketplace/marketplace-operator-79b997595-h4zrd/951dfb31-b021-468b-bbae-cd6077c0cfdd]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod marketplace-operator-79b997595-h4zrd in out of cluster comm: SetNetworkStatus: failed to update the pod marketplace-operator-79b997595-h4zrd in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-h4zrd?timeout=1m0s": dial tcp 38.102.83.136:6443: connect: connection refused Feb 21 00:11:43 crc kubenswrapper[4906]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 21 00:11:43 crc kubenswrapper[4906]: > Feb 21 00:11:43 crc kubenswrapper[4906]: E0221 00:11:43.386085 4906 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Feb 21 00:11:43 crc kubenswrapper[4906]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_marketplace-operator-79b997595-h4zrd_openshift-marketplace_951dfb31-b021-468b-bbae-cd6077c0cfdd_0(9f55d54bb1c97817992c26c2136c1aba508ffea0272de069120868c82a78ecc9): error adding pod openshift-marketplace_marketplace-operator-79b997595-h4zrd to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" 
failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"9f55d54bb1c97817992c26c2136c1aba508ffea0272de069120868c82a78ecc9" Netns:"/var/run/netns/bc20bec0-4168-4cd0-ba32-9ff35c88c94c" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=marketplace-operator-79b997595-h4zrd;K8S_POD_INFRA_CONTAINER_ID=9f55d54bb1c97817992c26c2136c1aba508ffea0272de069120868c82a78ecc9;K8S_POD_UID=951dfb31-b021-468b-bbae-cd6077c0cfdd" Path:"" ERRORED: error configuring pod [openshift-marketplace/marketplace-operator-79b997595-h4zrd] networking: Multus: [openshift-marketplace/marketplace-operator-79b997595-h4zrd/951dfb31-b021-468b-bbae-cd6077c0cfdd]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod marketplace-operator-79b997595-h4zrd in out of cluster comm: SetNetworkStatus: failed to update the pod marketplace-operator-79b997595-h4zrd in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-h4zrd?timeout=1m0s": dial tcp 38.102.83.136:6443: connect: connection refused Feb 21 00:11:43 crc kubenswrapper[4906]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 21 00:11:43 crc kubenswrapper[4906]: > pod="openshift-marketplace/marketplace-operator-79b997595-h4zrd" Feb 21 00:11:43 crc kubenswrapper[4906]: E0221 00:11:43.386108 4906 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Feb 21 00:11:43 crc kubenswrapper[4906]: rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_marketplace-operator-79b997595-h4zrd_openshift-marketplace_951dfb31-b021-468b-bbae-cd6077c0cfdd_0(9f55d54bb1c97817992c26c2136c1aba508ffea0272de069120868c82a78ecc9): error adding pod openshift-marketplace_marketplace-operator-79b997595-h4zrd to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"9f55d54bb1c97817992c26c2136c1aba508ffea0272de069120868c82a78ecc9" Netns:"/var/run/netns/bc20bec0-4168-4cd0-ba32-9ff35c88c94c" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=marketplace-operator-79b997595-h4zrd;K8S_POD_INFRA_CONTAINER_ID=9f55d54bb1c97817992c26c2136c1aba508ffea0272de069120868c82a78ecc9;K8S_POD_UID=951dfb31-b021-468b-bbae-cd6077c0cfdd" Path:"" ERRORED: error configuring pod [openshift-marketplace/marketplace-operator-79b997595-h4zrd] networking: Multus: [openshift-marketplace/marketplace-operator-79b997595-h4zrd/951dfb31-b021-468b-bbae-cd6077c0cfdd]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod marketplace-operator-79b997595-h4zrd in out of cluster comm: SetNetworkStatus: failed to update the pod marketplace-operator-79b997595-h4zrd in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-h4zrd?timeout=1m0s": dial tcp 38.102.83.136:6443: connect: connection refused Feb 21 00:11:43 crc kubenswrapper[4906]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 21 00:11:43 crc kubenswrapper[4906]: > 
pod="openshift-marketplace/marketplace-operator-79b997595-h4zrd" Feb 21 00:11:43 crc kubenswrapper[4906]: E0221 00:11:43.386163 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"marketplace-operator-79b997595-h4zrd_openshift-marketplace(951dfb31-b021-468b-bbae-cd6077c0cfdd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"marketplace-operator-79b997595-h4zrd_openshift-marketplace(951dfb31-b021-468b-bbae-cd6077c0cfdd)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_marketplace-operator-79b997595-h4zrd_openshift-marketplace_951dfb31-b021-468b-bbae-cd6077c0cfdd_0(9f55d54bb1c97817992c26c2136c1aba508ffea0272de069120868c82a78ecc9): error adding pod openshift-marketplace_marketplace-operator-79b997595-h4zrd to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"9f55d54bb1c97817992c26c2136c1aba508ffea0272de069120868c82a78ecc9\\\" Netns:\\\"/var/run/netns/bc20bec0-4168-4cd0-ba32-9ff35c88c94c\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=marketplace-operator-79b997595-h4zrd;K8S_POD_INFRA_CONTAINER_ID=9f55d54bb1c97817992c26c2136c1aba508ffea0272de069120868c82a78ecc9;K8S_POD_UID=951dfb31-b021-468b-bbae-cd6077c0cfdd\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-marketplace/marketplace-operator-79b997595-h4zrd] networking: Multus: [openshift-marketplace/marketplace-operator-79b997595-h4zrd/951dfb31-b021-468b-bbae-cd6077c0cfdd]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod marketplace-operator-79b997595-h4zrd in out of cluster comm: SetNetworkStatus: failed to update the pod marketplace-operator-79b997595-h4zrd in out of cluster comm: status update failed for pod /: Get 
\\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-h4zrd?timeout=1m0s\\\": dial tcp 38.102.83.136:6443: connect: connection refused\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-marketplace/marketplace-operator-79b997595-h4zrd" podUID="951dfb31-b021-468b-bbae-cd6077c0cfdd" Feb 21 00:11:43 crc kubenswrapper[4906]: E0221 00:11:43.386520 4906 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.136:6443: connect: connection refused" event=< Feb 21 00:11:43 crc kubenswrapper[4906]: &Event{ObjectMeta:{marketplace-operator-79b997595-h4zrd.18961a8ea5cb7f9b openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:marketplace-operator-79b997595-h4zrd,UID:951dfb31-b021-468b-bbae-cd6077c0cfdd,APIVersion:v1,ResourceVersion:29604,FieldPath:,},Reason:FailedCreatePodSandBox,Message:Failed to create pod sandbox: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_marketplace-operator-79b997595-h4zrd_openshift-marketplace_951dfb31-b021-468b-bbae-cd6077c0cfdd_0(9f55d54bb1c97817992c26c2136c1aba508ffea0272de069120868c82a78ecc9): error adding pod openshift-marketplace_marketplace-operator-79b997595-h4zrd to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request 
failed with status 400: 'ContainerID:"9f55d54bb1c97817992c26c2136c1aba508ffea0272de069120868c82a78ecc9" Netns:"/var/run/netns/bc20bec0-4168-4cd0-ba32-9ff35c88c94c" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=marketplace-operator-79b997595-h4zrd;K8S_POD_INFRA_CONTAINER_ID=9f55d54bb1c97817992c26c2136c1aba508ffea0272de069120868c82a78ecc9;K8S_POD_UID=951dfb31-b021-468b-bbae-cd6077c0cfdd" Path:"" ERRORED: error configuring pod [openshift-marketplace/marketplace-operator-79b997595-h4zrd] networking: Multus: [openshift-marketplace/marketplace-operator-79b997595-h4zrd/951dfb31-b021-468b-bbae-cd6077c0cfdd]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod marketplace-operator-79b997595-h4zrd in out of cluster comm: SetNetworkStatus: failed to update the pod marketplace-operator-79b997595-h4zrd in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-h4zrd?timeout=1m0s": dial tcp 38.102.83.136:6443: connect: connection refused Feb 21 00:11:43 crc kubenswrapper[4906]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"},Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-21 00:11:43.386124187 +0000 UTC m=+238.637711693,LastTimestamp:2026-02-21 00:11:43.386124187 +0000 UTC m=+238.637711693,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 21 00:11:43 crc kubenswrapper[4906]: > Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.756960 4906 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5mlvx" event={"ID":"06df1df6-9fbb-40ed-a746-f90f10d5286f","Type":"ContainerDied","Data":"d15f2d0777542ea02e91b3c6c3a21e171c7d6701a27fee9ceea650bcda6dba31"} Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.757006 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5mlvx" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.757011 4906 scope.go:117] "RemoveContainer" containerID="373b2f2b11d9dd1ff16703493e1fea0ab03355eb78f2b61b6697c4a3bc4eee9e" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.757640 4906 status_manager.go:851] "Failed to get status for pod" podUID="16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e" pod="openshift-marketplace/marketplace-operator-79b997595-fm5j7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-fm5j7\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.757818 4906 status_manager.go:851] "Failed to get status for pod" podUID="2ac4beba-19e0-4278-9701-d830f33d6688" pod="openshift-marketplace/community-operators-ptwns" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ptwns\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.758550 4906 status_manager.go:851] "Failed to get status for pod" podUID="0c851d93-c82d-4d73-b456-47c8fa6e3f5d" pod="openshift-marketplace/certified-operators-zf44m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-zf44m\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.759919 4906 status_manager.go:851] "Failed to get status for pod" 
podUID="06df1df6-9fbb-40ed-a746-f90f10d5286f" pod="openshift-marketplace/redhat-marketplace-5mlvx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5mlvx\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.760619 4906 status_manager.go:851] "Failed to get status for pod" podUID="0c851d93-c82d-4d73-b456-47c8fa6e3f5d" pod="openshift-marketplace/certified-operators-zf44m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-zf44m\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.760841 4906 status_manager.go:851] "Failed to get status for pod" podUID="06df1df6-9fbb-40ed-a746-f90f10d5286f" pod="openshift-marketplace/redhat-marketplace-5mlvx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5mlvx\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.761101 4906 status_manager.go:851] "Failed to get status for pod" podUID="16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e" pod="openshift-marketplace/marketplace-operator-79b997595-fm5j7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-fm5j7\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.761323 4906 status_manager.go:851] "Failed to get status for pod" podUID="2ac4beba-19e0-4278-9701-d830f33d6688" pod="openshift-marketplace/community-operators-ptwns" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ptwns\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.762172 4906 generic.go:334] "Generic (PLEG): container 
finished" podID="03720e31-8028-44b3-9bc7-54cb0474a821" containerID="70db0738c93dba9222aeb95ece51e5d21a3869ce63dead82a6cba5dad27d36d1" exitCode=0 Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.762549 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"03720e31-8028-44b3-9bc7-54cb0474a821","Type":"ContainerDied","Data":"70db0738c93dba9222aeb95ece51e5d21a3869ce63dead82a6cba5dad27d36d1"} Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.763808 4906 status_manager.go:851] "Failed to get status for pod" podUID="16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e" pod="openshift-marketplace/marketplace-operator-79b997595-fm5j7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-fm5j7\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.764056 4906 status_manager.go:851] "Failed to get status for pod" podUID="2ac4beba-19e0-4278-9701-d830f33d6688" pod="openshift-marketplace/community-operators-ptwns" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ptwns\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.764313 4906 status_manager.go:851] "Failed to get status for pod" podUID="03720e31-8028-44b3-9bc7-54cb0474a821" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.764547 4906 status_manager.go:851] "Failed to get status for pod" podUID="0c851d93-c82d-4d73-b456-47c8fa6e3f5d" pod="openshift-marketplace/certified-operators-zf44m" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-zf44m\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.764799 4906 status_manager.go:851] "Failed to get status for pod" podUID="06df1df6-9fbb-40ed-a746-f90f10d5286f" pod="openshift-marketplace/redhat-marketplace-5mlvx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5mlvx\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.765529 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.776865 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.778227 4906 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8d527700428fdbde47902ae05e7592fb0f3b43af16fa72386eb4a1ff274156a5" exitCode=0 Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.778245 4906 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58" exitCode=0 Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.778252 4906 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa" exitCode=0 Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.778259 4906 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c" exitCode=2 Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.778950 4906 scope.go:117] "RemoveContainer" containerID="118efcbdcfc1710531b17df1d6a1ca120bceee20bc949f8d3a5bab010d22e928" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.780561 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-h4zrd" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.780989 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-h4zrd" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.781209 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bpdlr" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.781629 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bpdlr" event={"ID":"e2cdeb59-eeb8-4641-921e-05a840fd96fa","Type":"ContainerDied","Data":"f1d09f215e5e4cc771d74cc35b06f8bfd5d741c1fcf9f2ce3f2492494d028bd1"} Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.782163 4906 status_manager.go:851] "Failed to get status for pod" podUID="0c851d93-c82d-4d73-b456-47c8fa6e3f5d" pod="openshift-marketplace/certified-operators-zf44m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-zf44m\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.782972 4906 status_manager.go:851] "Failed to get status for pod" podUID="06df1df6-9fbb-40ed-a746-f90f10d5286f" pod="openshift-marketplace/redhat-marketplace-5mlvx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5mlvx\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 
00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.783559 4906 status_manager.go:851] "Failed to get status for pod" podUID="e2cdeb59-eeb8-4641-921e-05a840fd96fa" pod="openshift-marketplace/redhat-operators-bpdlr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bpdlr\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.783812 4906 status_manager.go:851] "Failed to get status for pod" podUID="16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e" pod="openshift-marketplace/marketplace-operator-79b997595-fm5j7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-fm5j7\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.784013 4906 status_manager.go:851] "Failed to get status for pod" podUID="2ac4beba-19e0-4278-9701-d830f33d6688" pod="openshift-marketplace/community-operators-ptwns" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ptwns\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.784213 4906 status_manager.go:851] "Failed to get status for pod" podUID="03720e31-8028-44b3-9bc7-54cb0474a821" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.785713 4906 status_manager.go:851] "Failed to get status for pod" podUID="2ac4beba-19e0-4278-9701-d830f33d6688" pod="openshift-marketplace/community-operators-ptwns" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ptwns\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 
00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.786024 4906 status_manager.go:851] "Failed to get status for pod" podUID="03720e31-8028-44b3-9bc7-54cb0474a821" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.786404 4906 status_manager.go:851] "Failed to get status for pod" podUID="0c851d93-c82d-4d73-b456-47c8fa6e3f5d" pod="openshift-marketplace/certified-operators-zf44m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-zf44m\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.786635 4906 status_manager.go:851] "Failed to get status for pod" podUID="06df1df6-9fbb-40ed-a746-f90f10d5286f" pod="openshift-marketplace/redhat-marketplace-5mlvx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5mlvx\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.786842 4906 status_manager.go:851] "Failed to get status for pod" podUID="e2cdeb59-eeb8-4641-921e-05a840fd96fa" pod="openshift-marketplace/redhat-operators-bpdlr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bpdlr\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.787030 4906 status_manager.go:851] "Failed to get status for pod" podUID="16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e" pod="openshift-marketplace/marketplace-operator-79b997595-fm5j7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-fm5j7\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:43 
crc kubenswrapper[4906]: I0221 00:11:43.796829 4906 scope.go:117] "RemoveContainer" containerID="9a86890cabd0382954ee315ea5004a7295d3613ec572aae2e8908ce2e1f1741b" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.838321 4906 scope.go:117] "RemoveContainer" containerID="104130e8d59df60560691070c803ca780c06ea5439f8d04c160d53df2042c41f" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.869385 4906 scope.go:117] "RemoveContainer" containerID="c9ca58e8f0041907d9d30a883af56b57fe0f3e7a2f42f1534e698c32d0526b13" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.886630 4906 scope.go:117] "RemoveContainer" containerID="44f8ac25733dd0e112cf8018d266ce62b77b4a5ab875a018e9a5ddf2ec81c413" Feb 21 00:11:43 crc kubenswrapper[4906]: I0221 00:11:43.911552 4906 scope.go:117] "RemoveContainer" containerID="34f7392d91469153a0a564d2ca3729fcb3e96e04bcd10881fed0721d8a57a73f" Feb 21 00:11:44 crc kubenswrapper[4906]: E0221 00:11:44.353558 4906 log.go:32] "RunPodSandbox from runtime service failed" err=< Feb 21 00:11:44 crc kubenswrapper[4906]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_marketplace-operator-79b997595-h4zrd_openshift-marketplace_951dfb31-b021-468b-bbae-cd6077c0cfdd_0(86c2bd1bb356e5d5bf3977e935be33a8a2c4d7145aaf5efba893561a12b74a4a): error adding pod openshift-marketplace_marketplace-operator-79b997595-h4zrd to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"86c2bd1bb356e5d5bf3977e935be33a8a2c4d7145aaf5efba893561a12b74a4a" Netns:"/var/run/netns/ba061d9c-2984-453f-9151-ddfe61877a0f" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=marketplace-operator-79b997595-h4zrd;K8S_POD_INFRA_CONTAINER_ID=86c2bd1bb356e5d5bf3977e935be33a8a2c4d7145aaf5efba893561a12b74a4a;K8S_POD_UID=951dfb31-b021-468b-bbae-cd6077c0cfdd" Path:"" ERRORED: error configuring pod 
[openshift-marketplace/marketplace-operator-79b997595-h4zrd] networking: Multus: [openshift-marketplace/marketplace-operator-79b997595-h4zrd/951dfb31-b021-468b-bbae-cd6077c0cfdd]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod marketplace-operator-79b997595-h4zrd in out of cluster comm: SetNetworkStatus: failed to update the pod marketplace-operator-79b997595-h4zrd in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-h4zrd?timeout=1m0s": dial tcp 38.102.83.136:6443: connect: connection refused Feb 21 00:11:44 crc kubenswrapper[4906]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 21 00:11:44 crc kubenswrapper[4906]: > Feb 21 00:11:44 crc kubenswrapper[4906]: E0221 00:11:44.353843 4906 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Feb 21 00:11:44 crc kubenswrapper[4906]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_marketplace-operator-79b997595-h4zrd_openshift-marketplace_951dfb31-b021-468b-bbae-cd6077c0cfdd_0(86c2bd1bb356e5d5bf3977e935be33a8a2c4d7145aaf5efba893561a12b74a4a): error adding pod openshift-marketplace_marketplace-operator-79b997595-h4zrd to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"86c2bd1bb356e5d5bf3977e935be33a8a2c4d7145aaf5efba893561a12b74a4a" Netns:"/var/run/netns/ba061d9c-2984-453f-9151-ddfe61877a0f" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=marketplace-operator-79b997595-h4zrd;K8S_POD_INFRA_CONTAINER_ID=86c2bd1bb356e5d5bf3977e935be33a8a2c4d7145aaf5efba893561a12b74a4a;K8S_POD_UID=951dfb31-b021-468b-bbae-cd6077c0cfdd" Path:"" ERRORED: error configuring pod [openshift-marketplace/marketplace-operator-79b997595-h4zrd] networking: Multus: [openshift-marketplace/marketplace-operator-79b997595-h4zrd/951dfb31-b021-468b-bbae-cd6077c0cfdd]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod marketplace-operator-79b997595-h4zrd in out of cluster comm: SetNetworkStatus: failed to update the pod marketplace-operator-79b997595-h4zrd in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-h4zrd?timeout=1m0s": dial tcp 38.102.83.136:6443: connect: connection refused Feb 21 00:11:44 crc kubenswrapper[4906]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 21 00:11:44 crc kubenswrapper[4906]: > pod="openshift-marketplace/marketplace-operator-79b997595-h4zrd" Feb 21 00:11:44 crc kubenswrapper[4906]: E0221 00:11:44.353864 4906 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Feb 21 00:11:44 crc kubenswrapper[4906]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_marketplace-operator-79b997595-h4zrd_openshift-marketplace_951dfb31-b021-468b-bbae-cd6077c0cfdd_0(86c2bd1bb356e5d5bf3977e935be33a8a2c4d7145aaf5efba893561a12b74a4a): error adding pod openshift-marketplace_marketplace-operator-79b997595-h4zrd to CNI network 
"multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"86c2bd1bb356e5d5bf3977e935be33a8a2c4d7145aaf5efba893561a12b74a4a" Netns:"/var/run/netns/ba061d9c-2984-453f-9151-ddfe61877a0f" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=marketplace-operator-79b997595-h4zrd;K8S_POD_INFRA_CONTAINER_ID=86c2bd1bb356e5d5bf3977e935be33a8a2c4d7145aaf5efba893561a12b74a4a;K8S_POD_UID=951dfb31-b021-468b-bbae-cd6077c0cfdd" Path:"" ERRORED: error configuring pod [openshift-marketplace/marketplace-operator-79b997595-h4zrd] networking: Multus: [openshift-marketplace/marketplace-operator-79b997595-h4zrd/951dfb31-b021-468b-bbae-cd6077c0cfdd]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod marketplace-operator-79b997595-h4zrd in out of cluster comm: SetNetworkStatus: failed to update the pod marketplace-operator-79b997595-h4zrd in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-h4zrd?timeout=1m0s": dial tcp 38.102.83.136:6443: connect: connection refused Feb 21 00:11:44 crc kubenswrapper[4906]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 21 00:11:44 crc kubenswrapper[4906]: > pod="openshift-marketplace/marketplace-operator-79b997595-h4zrd" Feb 21 00:11:44 crc kubenswrapper[4906]: E0221 00:11:44.353918 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"marketplace-operator-79b997595-h4zrd_openshift-marketplace(951dfb31-b021-468b-bbae-cd6077c0cfdd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"marketplace-operator-79b997595-h4zrd_openshift-marketplace(951dfb31-b021-468b-bbae-cd6077c0cfdd)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_marketplace-operator-79b997595-h4zrd_openshift-marketplace_951dfb31-b021-468b-bbae-cd6077c0cfdd_0(86c2bd1bb356e5d5bf3977e935be33a8a2c4d7145aaf5efba893561a12b74a4a): error adding pod openshift-marketplace_marketplace-operator-79b997595-h4zrd to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"86c2bd1bb356e5d5bf3977e935be33a8a2c4d7145aaf5efba893561a12b74a4a\\\" Netns:\\\"/var/run/netns/ba061d9c-2984-453f-9151-ddfe61877a0f\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=marketplace-operator-79b997595-h4zrd;K8S_POD_INFRA_CONTAINER_ID=86c2bd1bb356e5d5bf3977e935be33a8a2c4d7145aaf5efba893561a12b74a4a;K8S_POD_UID=951dfb31-b021-468b-bbae-cd6077c0cfdd\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-marketplace/marketplace-operator-79b997595-h4zrd] networking: Multus: [openshift-marketplace/marketplace-operator-79b997595-h4zrd/951dfb31-b021-468b-bbae-cd6077c0cfdd]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod marketplace-operator-79b997595-h4zrd in out of cluster comm: SetNetworkStatus: failed to update the pod marketplace-operator-79b997595-h4zrd in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-h4zrd?timeout=1m0s\\\": dial tcp 38.102.83.136:6443: connect: connection refused\\n': StdinData: 
{\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-marketplace/marketplace-operator-79b997595-h4zrd" podUID="951dfb31-b021-468b-bbae-cd6077c0cfdd" Feb 21 00:11:44 crc kubenswrapper[4906]: I0221 00:11:44.794945 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.093564 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.094363 4906 status_manager.go:851] "Failed to get status for pod" podUID="0c851d93-c82d-4d73-b456-47c8fa6e3f5d" pod="openshift-marketplace/certified-operators-zf44m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-zf44m\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.094603 4906 status_manager.go:851] "Failed to get status for pod" podUID="06df1df6-9fbb-40ed-a746-f90f10d5286f" pod="openshift-marketplace/redhat-marketplace-5mlvx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5mlvx\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.094883 4906 status_manager.go:851] "Failed to get status for pod" podUID="e2cdeb59-eeb8-4641-921e-05a840fd96fa" 
pod="openshift-marketplace/redhat-operators-bpdlr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bpdlr\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.095107 4906 status_manager.go:851] "Failed to get status for pod" podUID="16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e" pod="openshift-marketplace/marketplace-operator-79b997595-fm5j7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-fm5j7\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.095324 4906 status_manager.go:851] "Failed to get status for pod" podUID="2ac4beba-19e0-4278-9701-d830f33d6688" pod="openshift-marketplace/community-operators-ptwns" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ptwns\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.095547 4906 status_manager.go:851] "Failed to get status for pod" podUID="03720e31-8028-44b3-9bc7-54cb0474a821" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.203112 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.203889 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.204406 4906 status_manager.go:851] "Failed to get status for pod" podUID="16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e" pod="openshift-marketplace/marketplace-operator-79b997595-fm5j7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-fm5j7\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.204724 4906 status_manager.go:851] "Failed to get status for pod" podUID="2ac4beba-19e0-4278-9701-d830f33d6688" pod="openshift-marketplace/community-operators-ptwns" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ptwns\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.204987 4906 status_manager.go:851] "Failed to get status for pod" podUID="03720e31-8028-44b3-9bc7-54cb0474a821" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.205242 4906 status_manager.go:851] "Failed to get status for pod" podUID="0c851d93-c82d-4d73-b456-47c8fa6e3f5d" pod="openshift-marketplace/certified-operators-zf44m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-zf44m\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.205739 4906 status_manager.go:851] "Failed to get status for pod" podUID="06df1df6-9fbb-40ed-a746-f90f10d5286f" pod="openshift-marketplace/redhat-marketplace-5mlvx" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5mlvx\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.205998 4906 status_manager.go:851] "Failed to get status for pod" podUID="e2cdeb59-eeb8-4641-921e-05a840fd96fa" pod="openshift-marketplace/redhat-operators-bpdlr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bpdlr\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.206242 4906 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.221531 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/03720e31-8028-44b3-9bc7-54cb0474a821-var-lock\") pod \"03720e31-8028-44b3-9bc7-54cb0474a821\" (UID: \"03720e31-8028-44b3-9bc7-54cb0474a821\") " Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.221579 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/03720e31-8028-44b3-9bc7-54cb0474a821-kube-api-access\") pod \"03720e31-8028-44b3-9bc7-54cb0474a821\" (UID: \"03720e31-8028-44b3-9bc7-54cb0474a821\") " Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.221632 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/03720e31-8028-44b3-9bc7-54cb0474a821-kubelet-dir\") pod \"03720e31-8028-44b3-9bc7-54cb0474a821\" (UID: \"03720e31-8028-44b3-9bc7-54cb0474a821\") " Feb 21 
00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.221640 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/03720e31-8028-44b3-9bc7-54cb0474a821-var-lock" (OuterVolumeSpecName: "var-lock") pod "03720e31-8028-44b3-9bc7-54cb0474a821" (UID: "03720e31-8028-44b3-9bc7-54cb0474a821"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.221708 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/03720e31-8028-44b3-9bc7-54cb0474a821-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "03720e31-8028-44b3-9bc7-54cb0474a821" (UID: "03720e31-8028-44b3-9bc7-54cb0474a821"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.221902 4906 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/03720e31-8028-44b3-9bc7-54cb0474a821-var-lock\") on node \"crc\" DevicePath \"\"" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.221944 4906 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/03720e31-8028-44b3-9bc7-54cb0474a821-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.229646 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03720e31-8028-44b3-9bc7-54cb0474a821-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "03720e31-8028-44b3-9bc7-54cb0474a821" (UID: "03720e31-8028-44b3-9bc7-54cb0474a821"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.323005 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.323098 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.323170 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.323159 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.323271 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.323362 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.323610 4906 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.323636 4906 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.323654 4906 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.323672 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/03720e31-8028-44b3-9bc7-54cb0474a821-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.731572 4906 status_manager.go:851] "Failed to get status for pod" podUID="03720e31-8028-44b3-9bc7-54cb0474a821" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.731872 4906 status_manager.go:851] "Failed 
to get status for pod" podUID="0c851d93-c82d-4d73-b456-47c8fa6e3f5d" pod="openshift-marketplace/certified-operators-zf44m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-zf44m\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.732771 4906 status_manager.go:851] "Failed to get status for pod" podUID="e2cdeb59-eeb8-4641-921e-05a840fd96fa" pod="openshift-marketplace/redhat-operators-bpdlr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bpdlr\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.732958 4906 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.733113 4906 status_manager.go:851] "Failed to get status for pod" podUID="06df1df6-9fbb-40ed-a746-f90f10d5286f" pod="openshift-marketplace/redhat-marketplace-5mlvx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5mlvx\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.733273 4906 status_manager.go:851] "Failed to get status for pod" podUID="16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e" pod="openshift-marketplace/marketplace-operator-79b997595-fm5j7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-fm5j7\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.733429 4906 status_manager.go:851] "Failed to get 
status for pod" podUID="2ac4beba-19e0-4278-9701-d830f33d6688" pod="openshift-marketplace/community-operators-ptwns" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ptwns\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.739920 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.803154 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.803147 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"03720e31-8028-44b3-9bc7-54cb0474a821","Type":"ContainerDied","Data":"a62d0e6f2fe718e4b46b29a6528da5ce2d812c383085b4f94c4959a6fe7a7e7f"} Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.803947 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a62d0e6f2fe718e4b46b29a6528da5ce2d812c383085b4f94c4959a6fe7a7e7f" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.806577 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.807317 4906 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8" exitCode=0 Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.807366 4906 scope.go:117] "RemoveContainer" containerID="8d527700428fdbde47902ae05e7592fb0f3b43af16fa72386eb4a1ff274156a5" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.807370 4906 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.807537 4906 status_manager.go:851] "Failed to get status for pod" podUID="16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e" pod="openshift-marketplace/marketplace-operator-79b997595-fm5j7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-fm5j7\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.807801 4906 status_manager.go:851] "Failed to get status for pod" podUID="2ac4beba-19e0-4278-9701-d830f33d6688" pod="openshift-marketplace/community-operators-ptwns" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ptwns\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.808090 4906 status_manager.go:851] "Failed to get status for pod" podUID="03720e31-8028-44b3-9bc7-54cb0474a821" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.808450 4906 status_manager.go:851] "Failed to get status for pod" podUID="0c851d93-c82d-4d73-b456-47c8fa6e3f5d" pod="openshift-marketplace/certified-operators-zf44m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-zf44m\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.808662 4906 status_manager.go:851] "Failed to get status for pod" podUID="06df1df6-9fbb-40ed-a746-f90f10d5286f" pod="openshift-marketplace/redhat-marketplace-5mlvx" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5mlvx\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.808943 4906 status_manager.go:851] "Failed to get status for pod" podUID="e2cdeb59-eeb8-4641-921e-05a840fd96fa" pod="openshift-marketplace/redhat-operators-bpdlr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bpdlr\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.809314 4906 status_manager.go:851] "Failed to get status for pod" podUID="16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e" pod="openshift-marketplace/marketplace-operator-79b997595-fm5j7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-fm5j7\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.809706 4906 status_manager.go:851] "Failed to get status for pod" podUID="2ac4beba-19e0-4278-9701-d830f33d6688" pod="openshift-marketplace/community-operators-ptwns" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ptwns\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.809958 4906 status_manager.go:851] "Failed to get status for pod" podUID="03720e31-8028-44b3-9bc7-54cb0474a821" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.810294 4906 status_manager.go:851] "Failed to get status for pod" podUID="0c851d93-c82d-4d73-b456-47c8fa6e3f5d" pod="openshift-marketplace/certified-operators-zf44m" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-zf44m\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.810580 4906 status_manager.go:851] "Failed to get status for pod" podUID="06df1df6-9fbb-40ed-a746-f90f10d5286f" pod="openshift-marketplace/redhat-marketplace-5mlvx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5mlvx\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.810854 4906 status_manager.go:851] "Failed to get status for pod" podUID="e2cdeb59-eeb8-4641-921e-05a840fd96fa" pod="openshift-marketplace/redhat-operators-bpdlr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bpdlr\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.811289 4906 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.823738 4906 scope.go:117] "RemoveContainer" containerID="767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.836565 4906 scope.go:117] "RemoveContainer" containerID="f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.857776 4906 scope.go:117] "RemoveContainer" containerID="1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.874777 4906 scope.go:117] "RemoveContainer" 
containerID="6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.896001 4906 scope.go:117] "RemoveContainer" containerID="088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.923265 4906 scope.go:117] "RemoveContainer" containerID="8d527700428fdbde47902ae05e7592fb0f3b43af16fa72386eb4a1ff274156a5" Feb 21 00:11:45 crc kubenswrapper[4906]: E0221 00:11:45.923644 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d527700428fdbde47902ae05e7592fb0f3b43af16fa72386eb4a1ff274156a5\": container with ID starting with 8d527700428fdbde47902ae05e7592fb0f3b43af16fa72386eb4a1ff274156a5 not found: ID does not exist" containerID="8d527700428fdbde47902ae05e7592fb0f3b43af16fa72386eb4a1ff274156a5" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.923707 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d527700428fdbde47902ae05e7592fb0f3b43af16fa72386eb4a1ff274156a5"} err="failed to get container status \"8d527700428fdbde47902ae05e7592fb0f3b43af16fa72386eb4a1ff274156a5\": rpc error: code = NotFound desc = could not find container \"8d527700428fdbde47902ae05e7592fb0f3b43af16fa72386eb4a1ff274156a5\": container with ID starting with 8d527700428fdbde47902ae05e7592fb0f3b43af16fa72386eb4a1ff274156a5 not found: ID does not exist" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.923731 4906 scope.go:117] "RemoveContainer" containerID="767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58" Feb 21 00:11:45 crc kubenswrapper[4906]: E0221 00:11:45.925071 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58\": container with ID starting with 
767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58 not found: ID does not exist" containerID="767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.925098 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58"} err="failed to get container status \"767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58\": rpc error: code = NotFound desc = could not find container \"767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58\": container with ID starting with 767a4e9711e2aebc34c46b0d348d258d691d5b4156461c66345cfb3d0511da58 not found: ID does not exist" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.925118 4906 scope.go:117] "RemoveContainer" containerID="f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa" Feb 21 00:11:45 crc kubenswrapper[4906]: E0221 00:11:45.926162 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa\": container with ID starting with f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa not found: ID does not exist" containerID="f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.926189 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa"} err="failed to get container status \"f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa\": rpc error: code = NotFound desc = could not find container \"f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa\": container with ID starting with f4a1c930c215ad818cdfdec25199f1b06135072fce64e0f6914926363331ffaa not found: ID does not 
exist" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.926207 4906 scope.go:117] "RemoveContainer" containerID="1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c" Feb 21 00:11:45 crc kubenswrapper[4906]: E0221 00:11:45.927080 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c\": container with ID starting with 1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c not found: ID does not exist" containerID="1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.927124 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c"} err="failed to get container status \"1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c\": rpc error: code = NotFound desc = could not find container \"1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c\": container with ID starting with 1e5f4037aab7b9f956d2bf8d7303a0962a34e9de0053d99384223db79369902c not found: ID does not exist" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.927137 4906 scope.go:117] "RemoveContainer" containerID="6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8" Feb 21 00:11:45 crc kubenswrapper[4906]: E0221 00:11:45.927498 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8\": container with ID starting with 6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8 not found: ID does not exist" containerID="6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.927521 4906 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8"} err="failed to get container status \"6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8\": rpc error: code = NotFound desc = could not find container \"6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8\": container with ID starting with 6902284343c7855629a0d3815210f787b1dc974a92458e278a599339baa0cfd8 not found: ID does not exist" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.927535 4906 scope.go:117] "RemoveContainer" containerID="088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9" Feb 21 00:11:45 crc kubenswrapper[4906]: E0221 00:11:45.928798 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\": container with ID starting with 088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9 not found: ID does not exist" containerID="088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9" Feb 21 00:11:45 crc kubenswrapper[4906]: I0221 00:11:45.928829 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9"} err="failed to get container status \"088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\": rpc error: code = NotFound desc = could not find container \"088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9\": container with ID starting with 088cda40ad68d343694d43044a2d619f22cee82852bec2d1f9d010c485e5d1a9 not found: ID does not exist" Feb 21 00:11:46 crc kubenswrapper[4906]: E0221 00:11:46.177223 4906 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: 
connect: connection refused" Feb 21 00:11:46 crc kubenswrapper[4906]: E0221 00:11:46.177525 4906 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:46 crc kubenswrapper[4906]: E0221 00:11:46.177911 4906 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:46 crc kubenswrapper[4906]: E0221 00:11:46.178398 4906 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:46 crc kubenswrapper[4906]: E0221 00:11:46.178849 4906 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 21 00:11:46 crc kubenswrapper[4906]: I0221 00:11:46.178908 4906 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 21 00:11:46 crc kubenswrapper[4906]: E0221 00:11:46.179202 4906 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" interval="200ms" Feb 21 00:11:46 crc kubenswrapper[4906]: E0221 00:11:46.380077 4906 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial 
tcp 38.102.83.136:6443: connect: connection refused" interval="400ms"
Feb 21 00:11:46 crc kubenswrapper[4906]: E0221 00:11:46.781655 4906 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" interval="800ms"
Feb 21 00:11:47 crc kubenswrapper[4906]: E0221 00:11:47.504896 4906 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.136:6443: connect: connection refused" event=<
Feb 21 00:11:47 crc kubenswrapper[4906]: &Event{ObjectMeta:{marketplace-operator-79b997595-h4zrd.18961a8ea5cb7f9b openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:marketplace-operator-79b997595-h4zrd,UID:951dfb31-b021-468b-bbae-cd6077c0cfdd,APIVersion:v1,ResourceVersion:29604,FieldPath:,},Reason:FailedCreatePodSandBox,Message:Failed to create pod sandbox: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_marketplace-operator-79b997595-h4zrd_openshift-marketplace_951dfb31-b021-468b-bbae-cd6077c0cfdd_0(9f55d54bb1c97817992c26c2136c1aba508ffea0272de069120868c82a78ecc9): error adding pod openshift-marketplace_marketplace-operator-79b997595-h4zrd to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"9f55d54bb1c97817992c26c2136c1aba508ffea0272de069120868c82a78ecc9" Netns:"/var/run/netns/bc20bec0-4168-4cd0-ba32-9ff35c88c94c" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=marketplace-operator-79b997595-h4zrd;K8S_POD_INFRA_CONTAINER_ID=9f55d54bb1c97817992c26c2136c1aba508ffea0272de069120868c82a78ecc9;K8S_POD_UID=951dfb31-b021-468b-bbae-cd6077c0cfdd" Path:"" ERRORED: error configuring pod [openshift-marketplace/marketplace-operator-79b997595-h4zrd] networking: Multus: [openshift-marketplace/marketplace-operator-79b997595-h4zrd/951dfb31-b021-468b-bbae-cd6077c0cfdd]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod marketplace-operator-79b997595-h4zrd in out of cluster comm: SetNetworkStatus: failed to update the pod marketplace-operator-79b997595-h4zrd in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-h4zrd?timeout=1m0s": dial tcp 38.102.83.136:6443: connect: connection refused
Feb 21 00:11:47 crc kubenswrapper[4906]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"},Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-21 00:11:43.386124187 +0000 UTC m=+238.637711693,LastTimestamp:2026-02-21 00:11:43.386124187 +0000 UTC m=+238.637711693,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Feb 21 00:11:47 crc kubenswrapper[4906]: >
Feb 21 00:11:47 crc kubenswrapper[4906]: E0221 00:11:47.583045 4906 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" interval="1.6s"
Feb 21 00:11:47 crc kubenswrapper[4906]: E0221 00:11:47.866948 4906 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.136:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 21 00:11:47 crc kubenswrapper[4906]: I0221 00:11:47.867514 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 21 00:11:47 crc kubenswrapper[4906]: W0221 00:11:47.897976 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-9e11e004582759e4956a73e33cd16622ef6c270ae4751ce51c6aca8d1d0f4146 WatchSource:0}: Error finding container 9e11e004582759e4956a73e33cd16622ef6c270ae4751ce51c6aca8d1d0f4146: Status 404 returned error can't find the container with id 9e11e004582759e4956a73e33cd16622ef6c270ae4751ce51c6aca8d1d0f4146
Feb 21 00:11:48 crc kubenswrapper[4906]: I0221 00:11:48.822421 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"caa57747c64ded975f0d638b7290670f0f59696d2b56618a3e32b20cbd3c6561"}
Feb 21 00:11:48 crc kubenswrapper[4906]: I0221 00:11:48.823284 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"9e11e004582759e4956a73e33cd16622ef6c270ae4751ce51c6aca8d1d0f4146"}
Feb 21 00:11:48 crc kubenswrapper[4906]: E0221 00:11:48.824816 4906 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.136:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 21 00:11:48 crc kubenswrapper[4906]: I0221 00:11:48.825413 4906 status_manager.go:851] "Failed to get status for pod" podUID="06df1df6-9fbb-40ed-a746-f90f10d5286f" pod="openshift-marketplace/redhat-marketplace-5mlvx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5mlvx\": dial tcp 38.102.83.136:6443: connect: connection refused"
Feb 21 00:11:48 crc kubenswrapper[4906]: I0221 00:11:48.825664 4906 status_manager.go:851] "Failed to get status for pod" podUID="e2cdeb59-eeb8-4641-921e-05a840fd96fa" pod="openshift-marketplace/redhat-operators-bpdlr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bpdlr\": dial tcp 38.102.83.136:6443: connect: connection refused"
Feb 21 00:11:48 crc kubenswrapper[4906]: I0221 00:11:48.826005 4906 status_manager.go:851] "Failed to get status for pod" podUID="16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e" pod="openshift-marketplace/marketplace-operator-79b997595-fm5j7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-fm5j7\": dial tcp 38.102.83.136:6443: connect: connection refused"
Feb 21 00:11:48 crc kubenswrapper[4906]: I0221 00:11:48.826276 4906 status_manager.go:851] "Failed to get status for pod" podUID="2ac4beba-19e0-4278-9701-d830f33d6688" pod="openshift-marketplace/community-operators-ptwns" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ptwns\": dial tcp 38.102.83.136:6443: connect: connection refused"
Feb 21 00:11:48 crc kubenswrapper[4906]: I0221 00:11:48.826497 4906 status_manager.go:851] "Failed to get status for pod" podUID="03720e31-8028-44b3-9bc7-54cb0474a821" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused"
Feb 21 00:11:48 crc kubenswrapper[4906]: I0221 00:11:48.826749 4906 status_manager.go:851] "Failed to get status for pod" podUID="0c851d93-c82d-4d73-b456-47c8fa6e3f5d" pod="openshift-marketplace/certified-operators-zf44m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-zf44m\": dial tcp 38.102.83.136:6443: connect: connection refused"
Feb 21 00:11:49 crc kubenswrapper[4906]: E0221 00:11:49.183747 4906 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" interval="3.2s"
Feb 21 00:11:52 crc kubenswrapper[4906]: E0221 00:11:52.384498 4906 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" interval="6.4s"
Feb 21 00:11:54 crc kubenswrapper[4906]: I0221 00:11:54.517017 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 21 00:11:54 crc kubenswrapper[4906]: I0221 00:11:54.517769 4906 status_manager.go:851] "Failed to get status for pod" podUID="2ac4beba-19e0-4278-9701-d830f33d6688" pod="openshift-marketplace/community-operators-ptwns" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ptwns\": dial tcp 38.102.83.136:6443: connect: connection refused"
Feb 21 00:11:54 crc kubenswrapper[4906]: I0221 00:11:54.518148 4906 status_manager.go:851] "Failed to get status for pod" podUID="03720e31-8028-44b3-9bc7-54cb0474a821" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused"
Feb 21 00:11:54 crc kubenswrapper[4906]: I0221 00:11:54.518432 4906 status_manager.go:851] "Failed to get status for pod" podUID="0c851d93-c82d-4d73-b456-47c8fa6e3f5d" pod="openshift-marketplace/certified-operators-zf44m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-zf44m\": dial tcp 38.102.83.136:6443: connect: connection refused"
Feb 21 00:11:54 crc kubenswrapper[4906]: I0221 00:11:54.518627 4906 status_manager.go:851] "Failed to get status for pod" podUID="e2cdeb59-eeb8-4641-921e-05a840fd96fa" pod="openshift-marketplace/redhat-operators-bpdlr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bpdlr\": dial tcp 38.102.83.136:6443: connect: connection refused"
Feb 21 00:11:54 crc kubenswrapper[4906]: I0221 00:11:54.518832 4906 status_manager.go:851] "Failed to get status for pod" podUID="06df1df6-9fbb-40ed-a746-f90f10d5286f" pod="openshift-marketplace/redhat-marketplace-5mlvx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5mlvx\": dial tcp 38.102.83.136:6443: connect: connection refused"
Feb 21 00:11:54 crc kubenswrapper[4906]: I0221 00:11:54.519117 4906 status_manager.go:851] "Failed to get status for pod" podUID="16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e" pod="openshift-marketplace/marketplace-operator-79b997595-fm5j7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-fm5j7\": dial tcp 38.102.83.136:6443: connect: connection refused"
Feb 21 00:11:54 crc kubenswrapper[4906]: I0221 00:11:54.540352 4906 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c8cf02d3-2a07-464d-b75f-8d3ad8374553"
Feb 21 00:11:54 crc kubenswrapper[4906]: I0221 00:11:54.540400 4906 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c8cf02d3-2a07-464d-b75f-8d3ad8374553"
Feb 21 00:11:54 crc kubenswrapper[4906]: E0221 00:11:54.540916 4906 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 21 00:11:54 crc kubenswrapper[4906]: I0221 00:11:54.541392 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 21 00:11:54 crc kubenswrapper[4906]: W0221 00:11:54.564871 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-648738dafe0167cb33080a00f408a7665372c719b6bb6e822dd15613f93fca8f WatchSource:0}: Error finding container 648738dafe0167cb33080a00f408a7665372c719b6bb6e822dd15613f93fca8f: Status 404 returned error can't find the container with id 648738dafe0167cb33080a00f408a7665372c719b6bb6e822dd15613f93fca8f
Feb 21 00:11:54 crc kubenswrapper[4906]: I0221 00:11:54.853131 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"bc499dee443e5df9abff7cde255c0d043c50c62f53e5ca9b897ef2ac3bfa20eb"}
Feb 21 00:11:54 crc kubenswrapper[4906]: I0221 00:11:54.853206 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"648738dafe0167cb33080a00f408a7665372c719b6bb6e822dd15613f93fca8f"}
Feb 21 00:11:55 crc kubenswrapper[4906]: I0221 00:11:55.526562 4906 status_manager.go:851] "Failed to get status for pod" podUID="03720e31-8028-44b3-9bc7-54cb0474a821" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused"
Feb 21 00:11:55 crc kubenswrapper[4906]: I0221 00:11:55.527343 4906 status_manager.go:851] "Failed to get status for pod" podUID="0c851d93-c82d-4d73-b456-47c8fa6e3f5d" pod="openshift-marketplace/certified-operators-zf44m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-zf44m\": dial tcp 38.102.83.136:6443: connect: connection refused"
Feb 21 00:11:55 crc kubenswrapper[4906]: I0221 00:11:55.528081 4906 status_manager.go:851] "Failed to get status for pod" podUID="06df1df6-9fbb-40ed-a746-f90f10d5286f" pod="openshift-marketplace/redhat-marketplace-5mlvx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5mlvx\": dial tcp 38.102.83.136:6443: connect: connection refused"
Feb 21 00:11:55 crc kubenswrapper[4906]: I0221 00:11:55.528534 4906 status_manager.go:851] "Failed to get status for pod" podUID="e2cdeb59-eeb8-4641-921e-05a840fd96fa" pod="openshift-marketplace/redhat-operators-bpdlr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bpdlr\": dial tcp 38.102.83.136:6443: connect: connection refused"
Feb 21 00:11:55 crc kubenswrapper[4906]: I0221 00:11:55.528987 4906 status_manager.go:851] "Failed to get status for pod" podUID="16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e" pod="openshift-marketplace/marketplace-operator-79b997595-fm5j7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-fm5j7\": dial tcp 38.102.83.136:6443: connect: connection refused"
Feb 21 00:11:55 crc kubenswrapper[4906]: I0221 00:11:55.529540 4906 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.136:6443: connect: connection refused"
Feb 21 00:11:55 crc kubenswrapper[4906]: I0221 00:11:55.530070 4906 status_manager.go:851] "Failed to get status for pod" podUID="2ac4beba-19e0-4278-9701-d830f33d6688" pod="openshift-marketplace/community-operators-ptwns" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ptwns\": dial tcp 38.102.83.136:6443: connect: connection refused"
Feb 21 00:11:55 crc kubenswrapper[4906]: I0221 00:11:55.862863 4906 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="bc499dee443e5df9abff7cde255c0d043c50c62f53e5ca9b897ef2ac3bfa20eb" exitCode=0
Feb 21 00:11:55 crc kubenswrapper[4906]: I0221 00:11:55.863182 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"bc499dee443e5df9abff7cde255c0d043c50c62f53e5ca9b897ef2ac3bfa20eb"}
Feb 21 00:11:55 crc kubenswrapper[4906]: I0221 00:11:55.863606 4906 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c8cf02d3-2a07-464d-b75f-8d3ad8374553"
Feb 21 00:11:55 crc kubenswrapper[4906]: I0221 00:11:55.863638 4906 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c8cf02d3-2a07-464d-b75f-8d3ad8374553"
Feb 21 00:11:55 crc kubenswrapper[4906]: E0221 00:11:55.864344 4906 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 21 00:11:55 crc kubenswrapper[4906]: I0221 00:11:55.864520 4906 status_manager.go:851] "Failed to get status for pod" podUID="0c851d93-c82d-4d73-b456-47c8fa6e3f5d" pod="openshift-marketplace/certified-operators-zf44m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-zf44m\": dial tcp 38.102.83.136:6443: connect: connection refused"
Feb 21 00:11:55 crc kubenswrapper[4906]: I0221 00:11:55.865303 4906 status_manager.go:851] "Failed to get status for pod" podUID="06df1df6-9fbb-40ed-a746-f90f10d5286f" pod="openshift-marketplace/redhat-marketplace-5mlvx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5mlvx\": dial tcp 38.102.83.136:6443: connect: connection refused"
Feb 21 00:11:55 crc kubenswrapper[4906]: I0221 00:11:55.866776 4906 status_manager.go:851] "Failed to get status for pod" podUID="e2cdeb59-eeb8-4641-921e-05a840fd96fa" pod="openshift-marketplace/redhat-operators-bpdlr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-bpdlr\": dial tcp 38.102.83.136:6443: connect: connection refused"
Feb 21 00:11:55 crc kubenswrapper[4906]: I0221 00:11:55.867228 4906 status_manager.go:851] "Failed to get status for pod" podUID="16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e" pod="openshift-marketplace/marketplace-operator-79b997595-fm5j7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-fm5j7\": dial tcp 38.102.83.136:6443: connect: connection refused"
Feb 21 00:11:55 crc kubenswrapper[4906]: I0221 00:11:55.867789 4906 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.136:6443: connect: connection refused"
Feb 21 00:11:55 crc kubenswrapper[4906]: I0221 00:11:55.868275 4906 status_manager.go:851] "Failed to get status for pod" podUID="2ac4beba-19e0-4278-9701-d830f33d6688" pod="openshift-marketplace/community-operators-ptwns" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ptwns\": dial tcp 38.102.83.136:6443: connect: connection refused"
Feb 21 00:11:55 crc kubenswrapper[4906]: I0221 00:11:55.868786 4906 status_manager.go:851] "Failed to get status for pod" podUID="03720e31-8028-44b3-9bc7-54cb0474a821" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused"
Feb 21 00:11:56 crc kubenswrapper[4906]: I0221 00:11:56.871315 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e01b6106066a02a6f19ac4ee0c09950e8083fc559f51ed9aad95bdc55c019078"}
Feb 21 00:11:56 crc kubenswrapper[4906]: I0221 00:11:56.872220 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e9233203ad530064847011132a454baf6ba7be8a8b48c9dc7d105a16b5133222"}
Feb 21 00:11:56 crc kubenswrapper[4906]: I0221 00:11:56.872332 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"66a3f8b1c2cc97e57d5df1e24d2f3b6f49232b0f5b34912658035ae7c64751f4"}
Feb 21 00:11:57 crc kubenswrapper[4906]: I0221 00:11:57.517146 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-h4zrd"
Feb 21 00:11:57 crc kubenswrapper[4906]: I0221 00:11:57.517937 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-h4zrd"
Feb 21 00:11:57 crc kubenswrapper[4906]: I0221 00:11:57.879235 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Feb 21 00:11:57 crc kubenswrapper[4906]: I0221 00:11:57.879453 4906 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="57d4b213ed7600da4237ff1c423d004e3afdaff5f599b453398266f6cdae16ac" exitCode=1
Feb 21 00:11:57 crc kubenswrapper[4906]: I0221 00:11:57.879500 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"57d4b213ed7600da4237ff1c423d004e3afdaff5f599b453398266f6cdae16ac"}
Feb 21 00:11:57 crc kubenswrapper[4906]: I0221 00:11:57.879915 4906 scope.go:117] "RemoveContainer" containerID="57d4b213ed7600da4237ff1c423d004e3afdaff5f599b453398266f6cdae16ac"
Feb 21 00:11:57 crc kubenswrapper[4906]: I0221 00:11:57.883045 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6cd17d567d5ad9c397e08e2a2284faa15c24ac969f9cb8bb7afabb0cccf90306"}
Feb 21 00:11:57 crc kubenswrapper[4906]: I0221 00:11:57.883074 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8e99a9c2e2626c5b2f2a61478ed384e073f761b69c50a831730fbadbfb2743fd"}
Feb 21 00:11:57 crc kubenswrapper[4906]: I0221 00:11:57.883255 4906 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c8cf02d3-2a07-464d-b75f-8d3ad8374553"
Feb 21 00:11:57 crc kubenswrapper[4906]: I0221 00:11:57.883271 4906 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c8cf02d3-2a07-464d-b75f-8d3ad8374553"
Feb 21 00:11:57 crc kubenswrapper[4906]: I0221 00:11:57.883451 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 21 00:11:58 crc kubenswrapper[4906]: I0221 00:11:58.893797 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Feb 21 00:11:58 crc kubenswrapper[4906]: I0221 00:11:58.893904 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f848928f673afbaa6c333d1126b916db71631f6ea126b8ee99b7014b0369ccc2"}
Feb 21 00:11:59 crc kubenswrapper[4906]: I0221 00:11:59.542137 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 21 00:11:59 crc kubenswrapper[4906]: I0221 00:11:59.542576 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 21 00:11:59 crc kubenswrapper[4906]: I0221 00:11:59.551986 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 21 00:12:01 crc kubenswrapper[4906]: I0221 00:12:01.529733 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 21 00:12:02 crc kubenswrapper[4906]: W0221 00:12:02.777847 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod951dfb31_b021_468b_bbae_cd6077c0cfdd.slice/crio-144bf7ffd2ff6aea2cf918647917c4d8e3db6188b053cf664ebaebec74f1b1ec WatchSource:0}: Error finding container 144bf7ffd2ff6aea2cf918647917c4d8e3db6188b053cf664ebaebec74f1b1ec: Status 404 returned error can't find the container with id 144bf7ffd2ff6aea2cf918647917c4d8e3db6188b053cf664ebaebec74f1b1ec
Feb 21 00:12:02 crc kubenswrapper[4906]: I0221 00:12:02.891295 4906 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 21 00:12:02 crc kubenswrapper[4906]: I0221 00:12:02.930806 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_71bb4a3aecc4ba5b26c4b7318770ce13/kube-apiserver-check-endpoints/0.log"
Feb 21 00:12:02 crc kubenswrapper[4906]: I0221 00:12:02.932327 4906 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="6cd17d567d5ad9c397e08e2a2284faa15c24ac969f9cb8bb7afabb0cccf90306" exitCode=255
Feb 21 00:12:02 crc kubenswrapper[4906]: I0221 00:12:02.932393 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"6cd17d567d5ad9c397e08e2a2284faa15c24ac969f9cb8bb7afabb0cccf90306"}
Feb 21 00:12:02 crc kubenswrapper[4906]: I0221 00:12:02.932678 4906 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c8cf02d3-2a07-464d-b75f-8d3ad8374553"
Feb 21 00:12:02 crc kubenswrapper[4906]: I0221 00:12:02.932714 4906 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c8cf02d3-2a07-464d-b75f-8d3ad8374553"
Feb 21 00:12:02 crc kubenswrapper[4906]: I0221 00:12:02.934365 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-h4zrd" event={"ID":"951dfb31-b021-468b-bbae-cd6077c0cfdd","Type":"ContainerStarted","Data":"7abc0a17b918ca08159d4edbf9db923fb1e233390643ed9428074899c43b8066"}
Feb 21 00:12:02 crc kubenswrapper[4906]: I0221 00:12:02.934398 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-h4zrd" event={"ID":"951dfb31-b021-468b-bbae-cd6077c0cfdd","Type":"ContainerStarted","Data":"144bf7ffd2ff6aea2cf918647917c4d8e3db6188b053cf664ebaebec74f1b1ec"}
Feb 21 00:12:02 crc kubenswrapper[4906]: I0221 00:12:02.936516 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-h4zrd"
Feb 21 00:12:02 crc kubenswrapper[4906]: I0221 00:12:02.937716 4906 scope.go:117] "RemoveContainer" containerID="6cd17d567d5ad9c397e08e2a2284faa15c24ac969f9cb8bb7afabb0cccf90306"
Feb 21 00:12:02 crc kubenswrapper[4906]: I0221 00:12:02.939667 4906 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-h4zrd container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.58:8080/healthz\": dial tcp 10.217.0.58:8080: connect: connection refused" start-of-body=
Feb 21 00:12:02 crc kubenswrapper[4906]: I0221 00:12:02.939757 4906 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-h4zrd" podUID="951dfb31-b021-468b-bbae-cd6077c0cfdd" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.58:8080/healthz\": dial tcp 10.217.0.58:8080: connect: connection refused"
Feb 21 00:12:02 crc kubenswrapper[4906]: I0221 00:12:02.945161 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 21 00:12:02 crc kubenswrapper[4906]: I0221 00:12:02.970873 4906 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="b216710a-7c56-4aa2-8e44-3841aed148cd"
Feb 21 00:12:03 crc kubenswrapper[4906]: I0221 00:12:03.945382 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_71bb4a3aecc4ba5b26c4b7318770ce13/kube-apiserver-check-endpoints/0.log"
Feb 21 00:12:03 crc kubenswrapper[4906]: I0221 00:12:03.948782 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"81a595bc9c049855f3e5f6fcfe38c5e45972e4fcb336902465b8b972786ffc58"}
Feb 21 00:12:03 crc kubenswrapper[4906]: I0221 00:12:03.948885 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 21 00:12:03 crc kubenswrapper[4906]: I0221 00:12:03.949007 4906 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c8cf02d3-2a07-464d-b75f-8d3ad8374553"
Feb 21 00:12:03 crc kubenswrapper[4906]: I0221 00:12:03.949040 4906 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c8cf02d3-2a07-464d-b75f-8d3ad8374553"
Feb 21 00:12:03 crc kubenswrapper[4906]: I0221 00:12:03.951046 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-h4zrd_951dfb31-b021-468b-bbae-cd6077c0cfdd/marketplace-operator/0.log"
Feb 21 00:12:03 crc kubenswrapper[4906]: I0221 00:12:03.951120 4906 generic.go:334] "Generic (PLEG): container finished" podID="951dfb31-b021-468b-bbae-cd6077c0cfdd" containerID="7abc0a17b918ca08159d4edbf9db923fb1e233390643ed9428074899c43b8066" exitCode=1
Feb 21 00:12:03 crc kubenswrapper[4906]: I0221 00:12:03.951152 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-h4zrd" event={"ID":"951dfb31-b021-468b-bbae-cd6077c0cfdd","Type":"ContainerDied","Data":"7abc0a17b918ca08159d4edbf9db923fb1e233390643ed9428074899c43b8066"}
Feb 21 00:12:03 crc kubenswrapper[4906]: I0221 00:12:03.951873 4906 scope.go:117] "RemoveContainer" containerID="7abc0a17b918ca08159d4edbf9db923fb1e233390643ed9428074899c43b8066"
Feb 21 00:12:03 crc kubenswrapper[4906]: I0221 00:12:03.952314 4906 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="b216710a-7c56-4aa2-8e44-3841aed148cd"
Feb 21 00:12:04 crc kubenswrapper[4906]: I0221 00:12:04.959305 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-h4zrd_951dfb31-b021-468b-bbae-cd6077c0cfdd/marketplace-operator/1.log"
Feb 21 00:12:04 crc kubenswrapper[4906]: I0221 00:12:04.960715 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-h4zrd_951dfb31-b021-468b-bbae-cd6077c0cfdd/marketplace-operator/0.log"
Feb 21 00:12:04 crc kubenswrapper[4906]: I0221 00:12:04.960813 4906 generic.go:334] "Generic (PLEG): container finished" podID="951dfb31-b021-468b-bbae-cd6077c0cfdd" containerID="cfc64b31664c37d2d113d39840ecafe99769d99b1d99050c1768a999b942f365" exitCode=1
Feb 21 00:12:04 crc kubenswrapper[4906]: I0221 00:12:04.960874 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-h4zrd" event={"ID":"951dfb31-b021-468b-bbae-cd6077c0cfdd","Type":"ContainerDied","Data":"cfc64b31664c37d2d113d39840ecafe99769d99b1d99050c1768a999b942f365"}
Feb 21 00:12:04 crc kubenswrapper[4906]: I0221 00:12:04.960945 4906 scope.go:117] "RemoveContainer" containerID="7abc0a17b918ca08159d4edbf9db923fb1e233390643ed9428074899c43b8066"
Feb 21 00:12:04 crc kubenswrapper[4906]: I0221 00:12:04.961370 4906 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c8cf02d3-2a07-464d-b75f-8d3ad8374553"
Feb 21 00:12:04 crc kubenswrapper[4906]: I0221 00:12:04.961414 4906 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c8cf02d3-2a07-464d-b75f-8d3ad8374553"
Feb 21 00:12:04 crc kubenswrapper[4906]: I0221 00:12:04.961479 4906 scope.go:117] "RemoveContainer" containerID="cfc64b31664c37d2d113d39840ecafe99769d99b1d99050c1768a999b942f365"
Feb 21 00:12:04 crc kubenswrapper[4906]: E0221 00:12:04.961979 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-h4zrd_openshift-marketplace(951dfb31-b021-468b-bbae-cd6077c0cfdd)\"" pod="openshift-marketplace/marketplace-operator-79b997595-h4zrd" podUID="951dfb31-b021-468b-bbae-cd6077c0cfdd"
Feb 21 00:12:04 crc kubenswrapper[4906]: I0221 00:12:04.986919 4906 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="b216710a-7c56-4aa2-8e44-3841aed148cd"
Feb 21 00:12:05 crc kubenswrapper[4906]: I0221 00:12:05.969497 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-h4zrd_951dfb31-b021-468b-bbae-cd6077c0cfdd/marketplace-operator/1.log"
Feb 21 00:12:05 crc kubenswrapper[4906]: I0221 00:12:05.970202 4906 scope.go:117] "RemoveContainer" containerID="cfc64b31664c37d2d113d39840ecafe99769d99b1d99050c1768a999b942f365"
Feb 21 00:12:05 crc kubenswrapper[4906]: E0221 00:12:05.970589 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-h4zrd_openshift-marketplace(951dfb31-b021-468b-bbae-cd6077c0cfdd)\"" pod="openshift-marketplace/marketplace-operator-79b997595-h4zrd" podUID="951dfb31-b021-468b-bbae-cd6077c0cfdd"
Feb 21 00:12:07 crc kubenswrapper[4906]: I0221 00:12:07.175384 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 21 00:12:07 crc kubenswrapper[4906]: I0221 00:12:07.182792 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 21 00:12:11 crc kubenswrapper[4906]: I0221 00:12:11.531103 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 21 00:12:12 crc kubenswrapper[4906]: I0221 00:12:12.664206 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-h4zrd"
Feb 21 00:12:12 crc kubenswrapper[4906]: I0221 00:12:12.664599 4906 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/marketplace-operator-79b997595-h4zrd"
Feb 21 00:12:12 crc kubenswrapper[4906]: I0221 00:12:12.665295 4906 scope.go:117] "RemoveContainer" containerID="cfc64b31664c37d2d113d39840ecafe99769d99b1d99050c1768a999b942f365"
Feb 21 00:12:12 crc kubenswrapper[4906]: E0221 00:12:12.665602 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-h4zrd_openshift-marketplace(951dfb31-b021-468b-bbae-cd6077c0cfdd)\"" pod="openshift-marketplace/marketplace-operator-79b997595-h4zrd" podUID="951dfb31-b021-468b-bbae-cd6077c0cfdd"
Feb 21 00:12:12 crc kubenswrapper[4906]: I0221 00:12:12.928566 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Feb 21 00:12:13 crc kubenswrapper[4906]: I0221 00:12:13.458614 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Feb 21 00:12:13 crc kubenswrapper[4906]: I0221 00:12:13.585212 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Feb 21 00:12:13 crc kubenswrapper[4906]: I0221 00:12:13.801313 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Feb 21 00:12:14 crc kubenswrapper[4906]: I0221 00:12:14.222622 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Feb 21 00:12:14 crc kubenswrapper[4906]: I0221 00:12:14.225509 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Feb 21 00:12:14 crc kubenswrapper[4906]: I0221 00:12:14.415406 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Feb 21 00:12:14 crc kubenswrapper[4906]: I0221 00:12:14.485229 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Feb 21 00:12:14 crc kubenswrapper[4906]: I0221 00:12:14.988124 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Feb 21 00:12:15 crc kubenswrapper[4906]: I0221 00:12:15.022752 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Feb 21 00:12:15 crc kubenswrapper[4906]: I0221 00:12:15.259588 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 21 00:12:15 crc kubenswrapper[4906]: I0221 00:12:15.502128 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Feb 21 00:12:15 crc
kubenswrapper[4906]: I0221 00:12:15.753210 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 21 00:12:15 crc kubenswrapper[4906]: I0221 00:12:15.942368 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 21 00:12:15 crc kubenswrapper[4906]: I0221 00:12:15.989260 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 21 00:12:16 crc kubenswrapper[4906]: I0221 00:12:16.093342 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 21 00:12:16 crc kubenswrapper[4906]: I0221 00:12:16.129880 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 21 00:12:16 crc kubenswrapper[4906]: I0221 00:12:16.327474 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 21 00:12:16 crc kubenswrapper[4906]: I0221 00:12:16.384347 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 21 00:12:16 crc kubenswrapper[4906]: I0221 00:12:16.430963 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 21 00:12:16 crc kubenswrapper[4906]: I0221 00:12:16.457799 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 21 00:12:16 crc kubenswrapper[4906]: I0221 00:12:16.526308 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 21 00:12:16 crc kubenswrapper[4906]: I0221 00:12:16.591345 4906 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 21 00:12:16 crc kubenswrapper[4906]: I0221 00:12:16.958605 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 21 00:12:16 crc kubenswrapper[4906]: I0221 00:12:16.987930 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 21 00:12:17 crc kubenswrapper[4906]: I0221 00:12:17.149160 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 21 00:12:17 crc kubenswrapper[4906]: I0221 00:12:17.302650 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 21 00:12:17 crc kubenswrapper[4906]: I0221 00:12:17.363436 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 21 00:12:17 crc kubenswrapper[4906]: I0221 00:12:17.460562 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 21 00:12:17 crc kubenswrapper[4906]: I0221 00:12:17.511628 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 21 00:12:17 crc kubenswrapper[4906]: I0221 00:12:17.745816 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 21 00:12:17 crc kubenswrapper[4906]: I0221 00:12:17.765238 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 21 00:12:17 crc kubenswrapper[4906]: I0221 00:12:17.824193 4906 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 21 00:12:17 crc kubenswrapper[4906]: I0221 00:12:17.950169 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 21 00:12:18 crc kubenswrapper[4906]: I0221 00:12:18.047095 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 21 00:12:18 crc kubenswrapper[4906]: I0221 00:12:18.090549 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 21 00:12:18 crc kubenswrapper[4906]: I0221 00:12:18.260790 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 21 00:12:18 crc kubenswrapper[4906]: I0221 00:12:18.282127 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 21 00:12:18 crc kubenswrapper[4906]: I0221 00:12:18.372483 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 21 00:12:18 crc kubenswrapper[4906]: I0221 00:12:18.394813 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 21 00:12:18 crc kubenswrapper[4906]: I0221 00:12:18.507509 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 21 00:12:18 crc kubenswrapper[4906]: I0221 00:12:18.529170 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 21 00:12:18 crc kubenswrapper[4906]: I0221 00:12:18.535034 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 21 00:12:18 
crc kubenswrapper[4906]: I0221 00:12:18.539673 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 21 00:12:18 crc kubenswrapper[4906]: I0221 00:12:18.552520 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 21 00:12:18 crc kubenswrapper[4906]: I0221 00:12:18.725626 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 21 00:12:18 crc kubenswrapper[4906]: I0221 00:12:18.894890 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 21 00:12:18 crc kubenswrapper[4906]: I0221 00:12:18.901948 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 21 00:12:18 crc kubenswrapper[4906]: I0221 00:12:18.920516 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 21 00:12:18 crc kubenswrapper[4906]: I0221 00:12:18.961266 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 21 00:12:18 crc kubenswrapper[4906]: I0221 00:12:18.961528 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 21 00:12:18 crc kubenswrapper[4906]: I0221 00:12:18.996071 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 21 00:12:19 crc kubenswrapper[4906]: I0221 00:12:19.011257 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 21 00:12:19 crc kubenswrapper[4906]: I0221 00:12:19.054780 4906 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console"/"console-oauth-config" Feb 21 00:12:19 crc kubenswrapper[4906]: I0221 00:12:19.062451 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 21 00:12:19 crc kubenswrapper[4906]: I0221 00:12:19.066118 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 21 00:12:19 crc kubenswrapper[4906]: I0221 00:12:19.072693 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 21 00:12:19 crc kubenswrapper[4906]: I0221 00:12:19.176152 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 21 00:12:19 crc kubenswrapper[4906]: I0221 00:12:19.248826 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 21 00:12:19 crc kubenswrapper[4906]: I0221 00:12:19.259715 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 21 00:12:19 crc kubenswrapper[4906]: I0221 00:12:19.266984 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 21 00:12:19 crc kubenswrapper[4906]: I0221 00:12:19.280590 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 21 00:12:19 crc kubenswrapper[4906]: I0221 00:12:19.379575 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 21 00:12:19 crc kubenswrapper[4906]: I0221 00:12:19.511769 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 21 00:12:19 crc kubenswrapper[4906]: I0221 00:12:19.540141 4906 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 21 00:12:19 crc kubenswrapper[4906]: I0221 00:12:19.569642 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 21 00:12:19 crc kubenswrapper[4906]: I0221 00:12:19.578444 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 21 00:12:19 crc kubenswrapper[4906]: I0221 00:12:19.628320 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 21 00:12:19 crc kubenswrapper[4906]: I0221 00:12:19.837414 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 21 00:12:19 crc kubenswrapper[4906]: I0221 00:12:19.847287 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 21 00:12:19 crc kubenswrapper[4906]: I0221 00:12:19.866558 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 21 00:12:19 crc kubenswrapper[4906]: I0221 00:12:19.870320 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 21 00:12:19 crc kubenswrapper[4906]: I0221 00:12:19.875754 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 21 00:12:19 crc kubenswrapper[4906]: I0221 00:12:19.876555 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 21 00:12:19 crc kubenswrapper[4906]: I0221 00:12:19.898056 4906 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 21 00:12:19 crc kubenswrapper[4906]: I0221 00:12:19.991436 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 21 00:12:20 crc kubenswrapper[4906]: I0221 00:12:20.027764 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 21 00:12:20 crc kubenswrapper[4906]: I0221 00:12:20.028941 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 21 00:12:20 crc kubenswrapper[4906]: I0221 00:12:20.051589 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 21 00:12:20 crc kubenswrapper[4906]: I0221 00:12:20.088229 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 21 00:12:20 crc kubenswrapper[4906]: I0221 00:12:20.131501 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 21 00:12:20 crc kubenswrapper[4906]: I0221 00:12:20.133796 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 21 00:12:20 crc kubenswrapper[4906]: I0221 00:12:20.142608 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 21 00:12:20 crc kubenswrapper[4906]: I0221 00:12:20.311422 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 21 00:12:20 crc kubenswrapper[4906]: I0221 00:12:20.419827 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 21 00:12:20 crc kubenswrapper[4906]: I0221 00:12:20.476153 4906 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 21 00:12:20 crc kubenswrapper[4906]: I0221 00:12:20.695587 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 21 00:12:20 crc kubenswrapper[4906]: I0221 00:12:20.712409 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 21 00:12:20 crc kubenswrapper[4906]: I0221 00:12:20.720335 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 21 00:12:20 crc kubenswrapper[4906]: I0221 00:12:20.755979 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 21 00:12:20 crc kubenswrapper[4906]: I0221 00:12:20.781099 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 21 00:12:20 crc kubenswrapper[4906]: I0221 00:12:20.890808 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 21 00:12:20 crc kubenswrapper[4906]: I0221 00:12:20.967941 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 21 00:12:20 crc kubenswrapper[4906]: I0221 00:12:20.988048 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 21 00:12:21 crc kubenswrapper[4906]: I0221 00:12:21.038083 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 21 00:12:21 crc kubenswrapper[4906]: I0221 00:12:21.125132 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 21 00:12:21 crc kubenswrapper[4906]: 
I0221 00:12:21.158569 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 21 00:12:21 crc kubenswrapper[4906]: I0221 00:12:21.199185 4906 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 21 00:12:21 crc kubenswrapper[4906]: I0221 00:12:21.251525 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 21 00:12:21 crc kubenswrapper[4906]: I0221 00:12:21.538927 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 21 00:12:21 crc kubenswrapper[4906]: I0221 00:12:21.541910 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 21 00:12:21 crc kubenswrapper[4906]: I0221 00:12:21.575414 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 21 00:12:21 crc kubenswrapper[4906]: I0221 00:12:21.645851 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 21 00:12:21 crc kubenswrapper[4906]: I0221 00:12:21.695200 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 21 00:12:21 crc kubenswrapper[4906]: I0221 00:12:21.735526 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 21 00:12:21 crc kubenswrapper[4906]: I0221 00:12:21.740785 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 21 00:12:21 crc kubenswrapper[4906]: I0221 00:12:21.775573 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" 
Feb 21 00:12:21 crc kubenswrapper[4906]: I0221 00:12:21.785331 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 21 00:12:21 crc kubenswrapper[4906]: I0221 00:12:21.840002 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 21 00:12:21 crc kubenswrapper[4906]: I0221 00:12:21.871615 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 21 00:12:21 crc kubenswrapper[4906]: I0221 00:12:21.880484 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 21 00:12:21 crc kubenswrapper[4906]: I0221 00:12:21.887617 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 21 00:12:21 crc kubenswrapper[4906]: I0221 00:12:21.891410 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 21 00:12:21 crc kubenswrapper[4906]: I0221 00:12:21.912898 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 21 00:12:21 crc kubenswrapper[4906]: I0221 00:12:21.968204 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 21 00:12:21 crc kubenswrapper[4906]: I0221 00:12:21.990058 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 21 00:12:22 crc kubenswrapper[4906]: I0221 00:12:22.014119 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 21 00:12:22 crc kubenswrapper[4906]: I0221 00:12:22.037502 4906 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 21 00:12:22 crc kubenswrapper[4906]: I0221 00:12:22.083625 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 21 00:12:22 crc kubenswrapper[4906]: I0221 00:12:22.094676 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 21 00:12:22 crc kubenswrapper[4906]: I0221 00:12:22.148915 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 21 00:12:22 crc kubenswrapper[4906]: I0221 00:12:22.263567 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 21 00:12:22 crc kubenswrapper[4906]: I0221 00:12:22.414835 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 21 00:12:22 crc kubenswrapper[4906]: I0221 00:12:22.607893 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 21 00:12:22 crc kubenswrapper[4906]: I0221 00:12:22.619415 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 21 00:12:22 crc kubenswrapper[4906]: I0221 00:12:22.674290 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 21 00:12:22 crc kubenswrapper[4906]: I0221 00:12:22.752118 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 21 00:12:22 crc kubenswrapper[4906]: I0221 00:12:22.836124 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 21 00:12:22 crc kubenswrapper[4906]: I0221 00:12:22.937318 
4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 21 00:12:22 crc kubenswrapper[4906]: I0221 00:12:22.948796 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 21 00:12:22 crc kubenswrapper[4906]: I0221 00:12:22.962187 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 21 00:12:23 crc kubenswrapper[4906]: I0221 00:12:23.014062 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 21 00:12:23 crc kubenswrapper[4906]: I0221 00:12:23.117759 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 21 00:12:23 crc kubenswrapper[4906]: I0221 00:12:23.184968 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 21 00:12:23 crc kubenswrapper[4906]: I0221 00:12:23.282303 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 21 00:12:23 crc kubenswrapper[4906]: I0221 00:12:23.322731 4906 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 21 00:12:23 crc kubenswrapper[4906]: I0221 00:12:23.333483 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 21 00:12:23 crc kubenswrapper[4906]: I0221 00:12:23.376657 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 21 00:12:23 crc kubenswrapper[4906]: I0221 00:12:23.427617 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 21 00:12:23 crc kubenswrapper[4906]: I0221 00:12:23.456059 4906 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 21 00:12:23 crc kubenswrapper[4906]: I0221 00:12:23.592630 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 21 00:12:23 crc kubenswrapper[4906]: I0221 00:12:23.611390 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 21 00:12:23 crc kubenswrapper[4906]: I0221 00:12:23.612173 4906 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 21 00:12:23 crc kubenswrapper[4906]: I0221 00:12:23.680357 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 21 00:12:23 crc kubenswrapper[4906]: I0221 00:12:23.722483 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 21 00:12:23 crc kubenswrapper[4906]: I0221 00:12:23.765112 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 21 00:12:23 crc kubenswrapper[4906]: I0221 00:12:23.820303 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 21 00:12:23 crc kubenswrapper[4906]: I0221 00:12:23.857391 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 21 00:12:23 crc kubenswrapper[4906]: I0221 00:12:23.874436 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 21 00:12:23 crc kubenswrapper[4906]: I0221 00:12:23.879930 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 21 00:12:24 crc 
kubenswrapper[4906]: I0221 00:12:24.082295 4906 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Feb 21 00:12:24 crc kubenswrapper[4906]: I0221 00:12:24.085309 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Feb 21 00:12:24 crc kubenswrapper[4906]: I0221 00:12:24.086556 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zf44m","openshift-marketplace/redhat-operators-bpdlr","openshift-marketplace/marketplace-operator-79b997595-fm5j7","openshift-marketplace/community-operators-ptwns","openshift-marketplace/redhat-marketplace-5mlvx","openshift-kube-apiserver/kube-apiserver-crc"]
Feb 21 00:12:24 crc kubenswrapper[4906]: I0221 00:12:24.086733 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Feb 21 00:12:24 crc kubenswrapper[4906]: I0221 00:12:24.086811 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-h4zrd"]
Feb 21 00:12:24 crc kubenswrapper[4906]: I0221 00:12:24.087060 4906 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c8cf02d3-2a07-464d-b75f-8d3ad8374553"
Feb 21 00:12:24 crc kubenswrapper[4906]: I0221 00:12:24.087089 4906 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c8cf02d3-2a07-464d-b75f-8d3ad8374553"
Feb 21 00:12:24 crc kubenswrapper[4906]: I0221 00:12:24.087608 4906 scope.go:117] "RemoveContainer" containerID="cfc64b31664c37d2d113d39840ecafe99769d99b1d99050c1768a999b942f365"
Feb 21 00:12:24 crc kubenswrapper[4906]: I0221 00:12:24.090416 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 21 00:12:24 crc kubenswrapper[4906]: I0221 00:12:24.107767 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=22.107746942 podStartE2EDuration="22.107746942s" podCreationTimestamp="2026-02-21 00:12:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:12:24.105327108 +0000 UTC m=+279.356914634" watchObservedRunningTime="2026-02-21 00:12:24.107746942 +0000 UTC m=+279.359334448"
Feb 21 00:12:24 crc kubenswrapper[4906]: I0221 00:12:24.111236 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 21 00:12:24 crc kubenswrapper[4906]: I0221 00:12:24.128464 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 21 00:12:24 crc kubenswrapper[4906]: I0221 00:12:24.142340 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 21 00:12:24 crc kubenswrapper[4906]: I0221 00:12:24.180946 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Feb 21 00:12:24 crc kubenswrapper[4906]: I0221 00:12:24.220317 4906 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Feb 21 00:12:24 crc kubenswrapper[4906]: I0221 00:12:24.220497 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Feb 21 00:12:24 crc kubenswrapper[4906]: I0221 00:12:24.252239 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Feb 21 00:12:24 crc kubenswrapper[4906]: I0221 00:12:24.320567 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Feb 21 00:12:24 crc kubenswrapper[4906]: I0221 00:12:24.334103 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Feb 21 00:12:24 crc kubenswrapper[4906]: I0221 00:12:24.396416 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Feb 21 00:12:24 crc kubenswrapper[4906]: I0221 00:12:24.417596 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Feb 21 00:12:24 crc kubenswrapper[4906]: I0221 00:12:24.430578 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Feb 21 00:12:24 crc kubenswrapper[4906]: I0221 00:12:24.528233 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Feb 21 00:12:24 crc kubenswrapper[4906]: I0221 00:12:24.572602 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Feb 21 00:12:24 crc kubenswrapper[4906]: I0221 00:12:24.574284 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Feb 21 00:12:24 crc kubenswrapper[4906]: I0221 00:12:24.649092 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Feb 21 00:12:24 crc kubenswrapper[4906]: I0221 00:12:24.721717 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Feb 21 00:12:24 crc kubenswrapper[4906]: I0221 00:12:24.812154 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Feb 21 00:12:24 crc kubenswrapper[4906]: I0221 00:12:24.834539 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Feb 21 00:12:24 crc kubenswrapper[4906]: I0221 00:12:24.877952 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Feb 21 00:12:24 crc kubenswrapper[4906]: I0221 00:12:24.892411 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Feb 21 00:12:24 crc kubenswrapper[4906]: I0221 00:12:24.904833 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Feb 21 00:12:24 crc kubenswrapper[4906]: I0221 00:12:24.907789 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Feb 21 00:12:24 crc kubenswrapper[4906]: I0221 00:12:24.934897 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Feb 21 00:12:25 crc kubenswrapper[4906]: I0221 00:12:25.046600 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Feb 21 00:12:25 crc kubenswrapper[4906]: I0221 00:12:25.049292 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Feb 21 00:12:25 crc kubenswrapper[4906]: I0221 00:12:25.082094 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-h4zrd_951dfb31-b021-468b-bbae-cd6077c0cfdd/marketplace-operator/1.log"
Feb 21 00:12:25 crc kubenswrapper[4906]: I0221 00:12:25.083057 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-h4zrd" event={"ID":"951dfb31-b021-468b-bbae-cd6077c0cfdd","Type":"ContainerStarted","Data":"a40fdabd13fa387476102934cc08b2c34c1a96f889643bb44968676545de05de"}
Feb 21 00:12:25 crc kubenswrapper[4906]: I0221 00:12:25.084909 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-h4zrd"
Feb 21 00:12:25 crc kubenswrapper[4906]: I0221 00:12:25.086139 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-h4zrd"
Feb 21 00:12:25 crc kubenswrapper[4906]: I0221 00:12:25.099880 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-h4zrd" podStartSLOduration=43.099862732 podStartE2EDuration="43.099862732s" podCreationTimestamp="2026-02-21 00:11:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:12:03.005202507 +0000 UTC m=+258.256790023" watchObservedRunningTime="2026-02-21 00:12:25.099862732 +0000 UTC m=+280.351450238"
Feb 21 00:12:25 crc kubenswrapper[4906]: I0221 00:12:25.116433 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Feb 21 00:12:25 crc kubenswrapper[4906]: I0221 00:12:25.123760 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Feb 21 00:12:25 crc kubenswrapper[4906]: I0221 00:12:25.214178 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Feb 21 00:12:25 crc kubenswrapper[4906]: I0221 00:12:25.226399 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Feb 21 00:12:25 crc kubenswrapper[4906]: I0221 00:12:25.240308 4906 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Feb 21 00:12:25 crc kubenswrapper[4906]: I0221 00:12:25.240728 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://caa57747c64ded975f0d638b7290670f0f59696d2b56618a3e32b20cbd3c6561" gracePeriod=5
Feb 21 00:12:25 crc kubenswrapper[4906]: I0221 00:12:25.261498 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Feb 21 00:12:25 crc kubenswrapper[4906]: I0221 00:12:25.289880 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 21 00:12:25 crc kubenswrapper[4906]: I0221 00:12:25.319265 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Feb 21 00:12:25 crc kubenswrapper[4906]: I0221 00:12:25.376427 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Feb 21 00:12:25 crc kubenswrapper[4906]: I0221 00:12:25.522641 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Feb 21 00:12:25 crc kubenswrapper[4906]: I0221 00:12:25.524531 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06df1df6-9fbb-40ed-a746-f90f10d5286f" path="/var/lib/kubelet/pods/06df1df6-9fbb-40ed-a746-f90f10d5286f/volumes"
Feb 21 00:12:25 crc kubenswrapper[4906]: I0221 00:12:25.525301 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c851d93-c82d-4d73-b456-47c8fa6e3f5d" path="/var/lib/kubelet/pods/0c851d93-c82d-4d73-b456-47c8fa6e3f5d/volumes"
Feb 21 00:12:25 crc kubenswrapper[4906]: I0221 00:12:25.525984 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e" path="/var/lib/kubelet/pods/16b5f9cd-19b1-42ca-b24e-cab3ae7a0b6e/volumes"
Feb 21 00:12:25 crc kubenswrapper[4906]: I0221 00:12:25.527005 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ac4beba-19e0-4278-9701-d830f33d6688" path="/var/lib/kubelet/pods/2ac4beba-19e0-4278-9701-d830f33d6688/volumes"
Feb 21 00:12:25 crc kubenswrapper[4906]: I0221 00:12:25.527575 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2cdeb59-eeb8-4641-921e-05a840fd96fa" path="/var/lib/kubelet/pods/e2cdeb59-eeb8-4641-921e-05a840fd96fa/volumes"
Feb 21 00:12:25 crc kubenswrapper[4906]: I0221 00:12:25.566056 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Feb 21 00:12:25 crc kubenswrapper[4906]: I0221 00:12:25.574312 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Feb 21 00:12:25 crc kubenswrapper[4906]: I0221 00:12:25.589080 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Feb 21 00:12:25 crc kubenswrapper[4906]: I0221 00:12:25.598916 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Feb 21 00:12:25 crc kubenswrapper[4906]: I0221 00:12:25.632096 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 21 00:12:25 crc kubenswrapper[4906]: I0221 00:12:25.659261 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Feb 21 00:12:25 crc kubenswrapper[4906]: I0221 00:12:25.673278 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Feb 21 00:12:25 crc kubenswrapper[4906]: I0221 00:12:25.705186 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Feb 21 00:12:25 crc kubenswrapper[4906]: I0221 00:12:25.714897 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Feb 21 00:12:25 crc kubenswrapper[4906]: I0221 00:12:25.727261 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Feb 21 00:12:25 crc kubenswrapper[4906]: I0221 00:12:25.836172 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Feb 21 00:12:26 crc kubenswrapper[4906]: I0221 00:12:26.016836 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Feb 21 00:12:26 crc kubenswrapper[4906]: I0221 00:12:26.090314 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Feb 21 00:12:26 crc kubenswrapper[4906]: I0221 00:12:26.117773 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Feb 21 00:12:26 crc kubenswrapper[4906]: I0221 00:12:26.132854 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Feb 21 00:12:26 crc kubenswrapper[4906]: I0221 00:12:26.171419 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Feb 21 00:12:26 crc kubenswrapper[4906]: I0221 00:12:26.293116 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Feb 21 00:12:26 crc kubenswrapper[4906]: I0221 00:12:26.337223 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Feb 21 00:12:26 crc kubenswrapper[4906]: I0221 00:12:26.459381 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Feb 21 00:12:26 crc kubenswrapper[4906]: I0221 00:12:26.594031 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Feb 21 00:12:26 crc kubenswrapper[4906]: I0221 00:12:26.651006 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 21 00:12:26 crc kubenswrapper[4906]: I0221 00:12:26.720704 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Feb 21 00:12:26 crc kubenswrapper[4906]: I0221 00:12:26.945808 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Feb 21 00:12:27 crc kubenswrapper[4906]: I0221 00:12:27.004514 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Feb 21 00:12:27 crc kubenswrapper[4906]: I0221 00:12:27.039472 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Feb 21 00:12:27 crc kubenswrapper[4906]: I0221 00:12:27.138983 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Feb 21 00:12:27 crc kubenswrapper[4906]: I0221 00:12:27.289337 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Feb 21 00:12:27 crc kubenswrapper[4906]: I0221 00:12:27.292283 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Feb 21 00:12:27 crc kubenswrapper[4906]: I0221 00:12:27.317140 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Feb 21 00:12:27 crc kubenswrapper[4906]: I0221 00:12:27.371634 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Feb 21 00:12:27 crc kubenswrapper[4906]: I0221 00:12:27.394344 4906 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Feb 21 00:12:27 crc kubenswrapper[4906]: I0221 00:12:27.408075 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Feb 21 00:12:27 crc kubenswrapper[4906]: I0221 00:12:27.419453 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 21 00:12:27 crc kubenswrapper[4906]: I0221 00:12:27.588023 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Feb 21 00:12:27 crc kubenswrapper[4906]: I0221 00:12:27.627635 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Feb 21 00:12:27 crc kubenswrapper[4906]: I0221 00:12:27.669738 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 21 00:12:27 crc kubenswrapper[4906]: I0221 00:12:27.749193 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Feb 21 00:12:27 crc kubenswrapper[4906]: I0221 00:12:27.777504 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 21 00:12:27 crc kubenswrapper[4906]: I0221 00:12:27.811745 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Feb 21 00:12:27 crc kubenswrapper[4906]: I0221 00:12:27.881597 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Feb 21 00:12:27 crc kubenswrapper[4906]: I0221 00:12:27.982205 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Feb 21 00:12:27 crc kubenswrapper[4906]: I0221 00:12:27.997465 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Feb 21 00:12:28 crc kubenswrapper[4906]: I0221 00:12:28.042366 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Feb 21 00:12:28 crc kubenswrapper[4906]: I0221 00:12:28.143050 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Feb 21 00:12:28 crc kubenswrapper[4906]: I0221 00:12:28.409976 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Feb 21 00:12:28 crc kubenswrapper[4906]: I0221 00:12:28.425275 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Feb 21 00:12:28 crc kubenswrapper[4906]: I0221 00:12:28.469027 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Feb 21 00:12:28 crc kubenswrapper[4906]: I0221 00:12:28.622235 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Feb 21 00:12:28 crc kubenswrapper[4906]: I0221 00:12:28.755837 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Feb 21 00:12:28 crc kubenswrapper[4906]: I0221 00:12:28.800116 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Feb 21 00:12:28 crc kubenswrapper[4906]: I0221 00:12:28.932141 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Feb 21 00:12:28 crc kubenswrapper[4906]: I0221 00:12:28.968835 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Feb 21 00:12:29 crc kubenswrapper[4906]: I0221 00:12:29.117183 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Feb 21 00:12:29 crc kubenswrapper[4906]: I0221 00:12:29.182412 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Feb 21 00:12:29 crc kubenswrapper[4906]: I0221 00:12:29.426805 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Feb 21 00:12:29 crc kubenswrapper[4906]: I0221 00:12:29.765168 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Feb 21 00:12:30 crc kubenswrapper[4906]: I0221 00:12:30.058880 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Feb 21 00:12:30 crc kubenswrapper[4906]: I0221 00:12:30.144884 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Feb 21 00:12:30 crc kubenswrapper[4906]: I0221 00:12:30.342631 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Feb 21 00:12:30 crc kubenswrapper[4906]: I0221 00:12:30.782646 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Feb 21 00:12:30 crc kubenswrapper[4906]: I0221 00:12:30.813204 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Feb 21 00:12:30 crc kubenswrapper[4906]: I0221 00:12:30.813292 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 21 00:12:30 crc kubenswrapper[4906]: I0221 00:12:30.934344 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 21 00:12:30 crc kubenswrapper[4906]: I0221 00:12:30.934499 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 21 00:12:30 crc kubenswrapper[4906]: I0221 00:12:30.934517 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 21 00:12:30 crc kubenswrapper[4906]: I0221 00:12:30.934604 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 21 00:12:30 crc kubenswrapper[4906]: I0221 00:12:30.934669 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 21 00:12:30 crc kubenswrapper[4906]: I0221 00:12:30.934739 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 21 00:12:30 crc kubenswrapper[4906]: I0221 00:12:30.934778 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 21 00:12:30 crc kubenswrapper[4906]: I0221 00:12:30.934844 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 21 00:12:30 crc kubenswrapper[4906]: I0221 00:12:30.934909 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 21 00:12:30 crc kubenswrapper[4906]: I0221 00:12:30.935080 4906 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\""
Feb 21 00:12:30 crc kubenswrapper[4906]: I0221 00:12:30.935097 4906 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\""
Feb 21 00:12:30 crc kubenswrapper[4906]: I0221 00:12:30.935109 4906 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\""
Feb 21 00:12:30 crc kubenswrapper[4906]: I0221 00:12:30.935120 4906 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\""
Feb 21 00:12:30 crc kubenswrapper[4906]: I0221 00:12:30.947922 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 21 00:12:31 crc kubenswrapper[4906]: I0221 00:12:31.036466 4906 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\""
Feb 21 00:12:31 crc kubenswrapper[4906]: I0221 00:12:31.115255 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Feb 21 00:12:31 crc kubenswrapper[4906]: I0221 00:12:31.115305 4906 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="caa57747c64ded975f0d638b7290670f0f59696d2b56618a3e32b20cbd3c6561" exitCode=137
Feb 21 00:12:31 crc kubenswrapper[4906]: I0221 00:12:31.115411 4906 scope.go:117] "RemoveContainer" containerID="caa57747c64ded975f0d638b7290670f0f59696d2b56618a3e32b20cbd3c6561"
Feb 21 00:12:31 crc kubenswrapper[4906]: I0221 00:12:31.115535 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 21 00:12:31 crc kubenswrapper[4906]: I0221 00:12:31.137441 4906 scope.go:117] "RemoveContainer" containerID="caa57747c64ded975f0d638b7290670f0f59696d2b56618a3e32b20cbd3c6561"
Feb 21 00:12:31 crc kubenswrapper[4906]: E0221 00:12:31.137977 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"caa57747c64ded975f0d638b7290670f0f59696d2b56618a3e32b20cbd3c6561\": container with ID starting with caa57747c64ded975f0d638b7290670f0f59696d2b56618a3e32b20cbd3c6561 not found: ID does not exist" containerID="caa57747c64ded975f0d638b7290670f0f59696d2b56618a3e32b20cbd3c6561"
Feb 21 00:12:31 crc kubenswrapper[4906]: I0221 00:12:31.138029 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"caa57747c64ded975f0d638b7290670f0f59696d2b56618a3e32b20cbd3c6561"} err="failed to get container status \"caa57747c64ded975f0d638b7290670f0f59696d2b56618a3e32b20cbd3c6561\": rpc error: code = NotFound desc = could not find container \"caa57747c64ded975f0d638b7290670f0f59696d2b56618a3e32b20cbd3c6561\": container with ID starting with caa57747c64ded975f0d638b7290670f0f59696d2b56618a3e32b20cbd3c6561 not found: ID does not exist"
Feb 21 00:12:31 crc kubenswrapper[4906]: I0221 00:12:31.193361 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Feb 21 00:12:31 crc kubenswrapper[4906]: I0221 00:12:31.522318 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes"
Feb 21 00:12:45 crc kubenswrapper[4906]: I0221 00:12:45.314599 4906 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials
Feb 21 00:12:49 crc kubenswrapper[4906]: I0221 00:12:49.574024 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-bp67z"]
Feb 21 00:12:49 crc kubenswrapper[4906]: I0221 00:12:49.574601 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-bp67z" podUID="f51d9e17-fed2-4d4a-aeab-8b135b6222fb" containerName="controller-manager" containerID="cri-o://173a8bc327f1bfe41fe3d9c7e3d8dbd06f569d51a09273f20af540655056fc5f" gracePeriod=30
Feb 21 00:12:49 crc kubenswrapper[4906]: I0221 00:12:49.684235 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-w4fcj"]
Feb 21 00:12:49 crc kubenswrapper[4906]: I0221 00:12:49.684599 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w4fcj" podUID="e0d315ba-d07b-4043-ba77-bf8859d16638" containerName="route-controller-manager" containerID="cri-o://c1b63c6130c7bc60070183886568804c802944d364933b663c1fc03d4a3e9e35" gracePeriod=30
Feb 21 00:12:49 crc kubenswrapper[4906]: I0221 00:12:49.991298 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-bp67z"
Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.023540 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w4fcj"
Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.083590 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f51d9e17-fed2-4d4a-aeab-8b135b6222fb-config\") pod \"f51d9e17-fed2-4d4a-aeab-8b135b6222fb\" (UID: \"f51d9e17-fed2-4d4a-aeab-8b135b6222fb\") "
Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.083742 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ls2rr\" (UniqueName: \"kubernetes.io/projected/f51d9e17-fed2-4d4a-aeab-8b135b6222fb-kube-api-access-ls2rr\") pod \"f51d9e17-fed2-4d4a-aeab-8b135b6222fb\" (UID: \"f51d9e17-fed2-4d4a-aeab-8b135b6222fb\") "
Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.083823 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f51d9e17-fed2-4d4a-aeab-8b135b6222fb-serving-cert\") pod \"f51d9e17-fed2-4d4a-aeab-8b135b6222fb\" (UID: \"f51d9e17-fed2-4d4a-aeab-8b135b6222fb\") "
Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.083869 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f51d9e17-fed2-4d4a-aeab-8b135b6222fb-proxy-ca-bundles\") pod \"f51d9e17-fed2-4d4a-aeab-8b135b6222fb\" (UID: \"f51d9e17-fed2-4d4a-aeab-8b135b6222fb\") "
Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.083896 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f51d9e17-fed2-4d4a-aeab-8b135b6222fb-client-ca\") pod \"f51d9e17-fed2-4d4a-aeab-8b135b6222fb\" (UID: \"f51d9e17-fed2-4d4a-aeab-8b135b6222fb\") "
Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.084611 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f51d9e17-fed2-4d4a-aeab-8b135b6222fb-config" (OuterVolumeSpecName: "config") pod "f51d9e17-fed2-4d4a-aeab-8b135b6222fb" (UID: "f51d9e17-fed2-4d4a-aeab-8b135b6222fb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.084632 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f51d9e17-fed2-4d4a-aeab-8b135b6222fb-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "f51d9e17-fed2-4d4a-aeab-8b135b6222fb" (UID: "f51d9e17-fed2-4d4a-aeab-8b135b6222fb"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.085174 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f51d9e17-fed2-4d4a-aeab-8b135b6222fb-client-ca" (OuterVolumeSpecName: "client-ca") pod "f51d9e17-fed2-4d4a-aeab-8b135b6222fb" (UID: "f51d9e17-fed2-4d4a-aeab-8b135b6222fb"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.085457 4906 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f51d9e17-fed2-4d4a-aeab-8b135b6222fb-client-ca\") on node \"crc\" DevicePath \"\""
Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.085482 4906 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f51d9e17-fed2-4d4a-aeab-8b135b6222fb-config\") on node \"crc\" DevicePath \"\""
Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.085494 4906 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f51d9e17-fed2-4d4a-aeab-8b135b6222fb-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.089137 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f51d9e17-fed2-4d4a-aeab-8b135b6222fb-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f51d9e17-fed2-4d4a-aeab-8b135b6222fb" (UID: "f51d9e17-fed2-4d4a-aeab-8b135b6222fb"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.091006 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f51d9e17-fed2-4d4a-aeab-8b135b6222fb-kube-api-access-ls2rr" (OuterVolumeSpecName: "kube-api-access-ls2rr") pod "f51d9e17-fed2-4d4a-aeab-8b135b6222fb" (UID: "f51d9e17-fed2-4d4a-aeab-8b135b6222fb"). InnerVolumeSpecName "kube-api-access-ls2rr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.186325 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0d315ba-d07b-4043-ba77-bf8859d16638-serving-cert\") pod \"e0d315ba-d07b-4043-ba77-bf8859d16638\" (UID: \"e0d315ba-d07b-4043-ba77-bf8859d16638\") "
Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.186433 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0d315ba-d07b-4043-ba77-bf8859d16638-config\") pod \"e0d315ba-d07b-4043-ba77-bf8859d16638\" (UID: \"e0d315ba-d07b-4043-ba77-bf8859d16638\") "
Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.186528 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0d315ba-d07b-4043-ba77-bf8859d16638-client-ca\") pod \"e0d315ba-d07b-4043-ba77-bf8859d16638\" (UID: \"e0d315ba-d07b-4043-ba77-bf8859d16638\") "
Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.186562 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2jnz\" (UniqueName: \"kubernetes.io/projected/e0d315ba-d07b-4043-ba77-bf8859d16638-kube-api-access-k2jnz\") pod \"e0d315ba-d07b-4043-ba77-bf8859d16638\" (UID: \"e0d315ba-d07b-4043-ba77-bf8859d16638\") "
Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.186821 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ls2rr\" (UniqueName: \"kubernetes.io/projected/f51d9e17-fed2-4d4a-aeab-8b135b6222fb-kube-api-access-ls2rr\") on node \"crc\" DevicePath \"\""
Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.186840 4906 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f51d9e17-fed2-4d4a-aeab-8b135b6222fb-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 
21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.187986 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0d315ba-d07b-4043-ba77-bf8859d16638-client-ca" (OuterVolumeSpecName: "client-ca") pod "e0d315ba-d07b-4043-ba77-bf8859d16638" (UID: "e0d315ba-d07b-4043-ba77-bf8859d16638"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.188085 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0d315ba-d07b-4043-ba77-bf8859d16638-config" (OuterVolumeSpecName: "config") pod "e0d315ba-d07b-4043-ba77-bf8859d16638" (UID: "e0d315ba-d07b-4043-ba77-bf8859d16638"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.189834 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0d315ba-d07b-4043-ba77-bf8859d16638-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e0d315ba-d07b-4043-ba77-bf8859d16638" (UID: "e0d315ba-d07b-4043-ba77-bf8859d16638"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.190209 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0d315ba-d07b-4043-ba77-bf8859d16638-kube-api-access-k2jnz" (OuterVolumeSpecName: "kube-api-access-k2jnz") pod "e0d315ba-d07b-4043-ba77-bf8859d16638" (UID: "e0d315ba-d07b-4043-ba77-bf8859d16638"). InnerVolumeSpecName "kube-api-access-k2jnz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.215949 4906 generic.go:334] "Generic (PLEG): container finished" podID="f51d9e17-fed2-4d4a-aeab-8b135b6222fb" containerID="173a8bc327f1bfe41fe3d9c7e3d8dbd06f569d51a09273f20af540655056fc5f" exitCode=0 Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.216070 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-bp67z" event={"ID":"f51d9e17-fed2-4d4a-aeab-8b135b6222fb","Type":"ContainerDied","Data":"173a8bc327f1bfe41fe3d9c7e3d8dbd06f569d51a09273f20af540655056fc5f"} Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.216107 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-bp67z" event={"ID":"f51d9e17-fed2-4d4a-aeab-8b135b6222fb","Type":"ContainerDied","Data":"adab02a6f226723c45a0e4badc16e4e4d2148e7aef61fdf16f7225de03513e8c"} Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.216106 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-bp67z" Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.216129 4906 scope.go:117] "RemoveContainer" containerID="173a8bc327f1bfe41fe3d9c7e3d8dbd06f569d51a09273f20af540655056fc5f" Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.221297 4906 generic.go:334] "Generic (PLEG): container finished" podID="e0d315ba-d07b-4043-ba77-bf8859d16638" containerID="c1b63c6130c7bc60070183886568804c802944d364933b663c1fc03d4a3e9e35" exitCode=0 Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.221363 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w4fcj" Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.221370 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w4fcj" event={"ID":"e0d315ba-d07b-4043-ba77-bf8859d16638","Type":"ContainerDied","Data":"c1b63c6130c7bc60070183886568804c802944d364933b663c1fc03d4a3e9e35"} Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.221529 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w4fcj" event={"ID":"e0d315ba-d07b-4043-ba77-bf8859d16638","Type":"ContainerDied","Data":"bf6e909d5f812860333cffe3579cd718aa224edd508444e4957465eb22e814c3"} Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.249202 4906 scope.go:117] "RemoveContainer" containerID="173a8bc327f1bfe41fe3d9c7e3d8dbd06f569d51a09273f20af540655056fc5f" Feb 21 00:12:50 crc kubenswrapper[4906]: E0221 00:12:50.249841 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"173a8bc327f1bfe41fe3d9c7e3d8dbd06f569d51a09273f20af540655056fc5f\": container with ID starting with 173a8bc327f1bfe41fe3d9c7e3d8dbd06f569d51a09273f20af540655056fc5f not found: ID does not exist" containerID="173a8bc327f1bfe41fe3d9c7e3d8dbd06f569d51a09273f20af540655056fc5f" Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.249896 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"173a8bc327f1bfe41fe3d9c7e3d8dbd06f569d51a09273f20af540655056fc5f"} err="failed to get container status \"173a8bc327f1bfe41fe3d9c7e3d8dbd06f569d51a09273f20af540655056fc5f\": rpc error: code = NotFound desc = could not find container \"173a8bc327f1bfe41fe3d9c7e3d8dbd06f569d51a09273f20af540655056fc5f\": container with ID starting with 
173a8bc327f1bfe41fe3d9c7e3d8dbd06f569d51a09273f20af540655056fc5f not found: ID does not exist" Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.249942 4906 scope.go:117] "RemoveContainer" containerID="c1b63c6130c7bc60070183886568804c802944d364933b663c1fc03d4a3e9e35" Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.281020 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-w4fcj"] Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.290152 4906 scope.go:117] "RemoveContainer" containerID="c1b63c6130c7bc60070183886568804c802944d364933b663c1fc03d4a3e9e35" Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.290383 4906 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0d315ba-d07b-4043-ba77-bf8859d16638-client-ca\") on node \"crc\" DevicePath \"\"" Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.290403 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2jnz\" (UniqueName: \"kubernetes.io/projected/e0d315ba-d07b-4043-ba77-bf8859d16638-kube-api-access-k2jnz\") on node \"crc\" DevicePath \"\"" Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.290412 4906 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0d315ba-d07b-4043-ba77-bf8859d16638-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.290421 4906 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0d315ba-d07b-4043-ba77-bf8859d16638-config\") on node \"crc\" DevicePath \"\"" Feb 21 00:12:50 crc kubenswrapper[4906]: E0221 00:12:50.291014 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1b63c6130c7bc60070183886568804c802944d364933b663c1fc03d4a3e9e35\": container with ID starting with 
c1b63c6130c7bc60070183886568804c802944d364933b663c1fc03d4a3e9e35 not found: ID does not exist" containerID="c1b63c6130c7bc60070183886568804c802944d364933b663c1fc03d4a3e9e35" Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.291041 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1b63c6130c7bc60070183886568804c802944d364933b663c1fc03d4a3e9e35"} err="failed to get container status \"c1b63c6130c7bc60070183886568804c802944d364933b663c1fc03d4a3e9e35\": rpc error: code = NotFound desc = could not find container \"c1b63c6130c7bc60070183886568804c802944d364933b663c1fc03d4a3e9e35\": container with ID starting with c1b63c6130c7bc60070183886568804c802944d364933b663c1fc03d4a3e9e35 not found: ID does not exist" Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.292675 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-w4fcj"] Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.301560 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-bp67z"] Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.305162 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-bp67z"] Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.934297 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69584597c4-vjcw9"] Feb 21 00:12:50 crc kubenswrapper[4906]: E0221 00:12:50.934959 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.934977 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 21 00:12:50 crc kubenswrapper[4906]: E0221 00:12:50.934989 
4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03720e31-8028-44b3-9bc7-54cb0474a821" containerName="installer" Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.934999 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="03720e31-8028-44b3-9bc7-54cb0474a821" containerName="installer" Feb 21 00:12:50 crc kubenswrapper[4906]: E0221 00:12:50.935015 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0d315ba-d07b-4043-ba77-bf8859d16638" containerName="route-controller-manager" Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.935023 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0d315ba-d07b-4043-ba77-bf8859d16638" containerName="route-controller-manager" Feb 21 00:12:50 crc kubenswrapper[4906]: E0221 00:12:50.935040 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f51d9e17-fed2-4d4a-aeab-8b135b6222fb" containerName="controller-manager" Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.935048 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="f51d9e17-fed2-4d4a-aeab-8b135b6222fb" containerName="controller-manager" Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.935144 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="f51d9e17-fed2-4d4a-aeab-8b135b6222fb" containerName="controller-manager" Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.935161 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.935174 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="03720e31-8028-44b3-9bc7-54cb0474a821" containerName="installer" Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.935191 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0d315ba-d07b-4043-ba77-bf8859d16638" containerName="route-controller-manager" Feb 21 00:12:50 crc kubenswrapper[4906]: 
I0221 00:12:50.935749 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69584597c4-vjcw9" Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.939116 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.940406 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.940888 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7f975d5594-lbftw"] Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.941925 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7f975d5594-lbftw" Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.948313 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.949011 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.949408 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.950407 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.950639 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.951367 4906 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.951565 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.951923 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.952523 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.952579 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.957120 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69584597c4-vjcw9"] Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.959365 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 21 00:12:50 crc kubenswrapper[4906]: I0221 00:12:50.961578 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7f975d5594-lbftw"] Feb 21 00:12:51 crc kubenswrapper[4906]: I0221 00:12:51.103012 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d82a125e-74b3-4ade-b7df-2c67500a5d58-client-ca\") pod \"route-controller-manager-69584597c4-vjcw9\" (UID: \"d82a125e-74b3-4ade-b7df-2c67500a5d58\") " pod="openshift-route-controller-manager/route-controller-manager-69584597c4-vjcw9" Feb 21 00:12:51 crc kubenswrapper[4906]: I0221 00:12:51.103069 4906 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d82a125e-74b3-4ade-b7df-2c67500a5d58-serving-cert\") pod \"route-controller-manager-69584597c4-vjcw9\" (UID: \"d82a125e-74b3-4ade-b7df-2c67500a5d58\") " pod="openshift-route-controller-manager/route-controller-manager-69584597c4-vjcw9" Feb 21 00:12:51 crc kubenswrapper[4906]: I0221 00:12:51.103192 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zn2x\" (UniqueName: \"kubernetes.io/projected/a3363856-b982-4c14-bc21-d76db47e5e1e-kube-api-access-6zn2x\") pod \"controller-manager-7f975d5594-lbftw\" (UID: \"a3363856-b982-4c14-bc21-d76db47e5e1e\") " pod="openshift-controller-manager/controller-manager-7f975d5594-lbftw" Feb 21 00:12:51 crc kubenswrapper[4906]: I0221 00:12:51.103235 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a3363856-b982-4c14-bc21-d76db47e5e1e-client-ca\") pod \"controller-manager-7f975d5594-lbftw\" (UID: \"a3363856-b982-4c14-bc21-d76db47e5e1e\") " pod="openshift-controller-manager/controller-manager-7f975d5594-lbftw" Feb 21 00:12:51 crc kubenswrapper[4906]: I0221 00:12:51.103342 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3363856-b982-4c14-bc21-d76db47e5e1e-config\") pod \"controller-manager-7f975d5594-lbftw\" (UID: \"a3363856-b982-4c14-bc21-d76db47e5e1e\") " pod="openshift-controller-manager/controller-manager-7f975d5594-lbftw" Feb 21 00:12:51 crc kubenswrapper[4906]: I0221 00:12:51.103375 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3363856-b982-4c14-bc21-d76db47e5e1e-serving-cert\") pod \"controller-manager-7f975d5594-lbftw\" (UID: 
\"a3363856-b982-4c14-bc21-d76db47e5e1e\") " pod="openshift-controller-manager/controller-manager-7f975d5594-lbftw" Feb 21 00:12:51 crc kubenswrapper[4906]: I0221 00:12:51.103427 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a3363856-b982-4c14-bc21-d76db47e5e1e-proxy-ca-bundles\") pod \"controller-manager-7f975d5594-lbftw\" (UID: \"a3363856-b982-4c14-bc21-d76db47e5e1e\") " pod="openshift-controller-manager/controller-manager-7f975d5594-lbftw" Feb 21 00:12:51 crc kubenswrapper[4906]: I0221 00:12:51.103449 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d82a125e-74b3-4ade-b7df-2c67500a5d58-config\") pod \"route-controller-manager-69584597c4-vjcw9\" (UID: \"d82a125e-74b3-4ade-b7df-2c67500a5d58\") " pod="openshift-route-controller-manager/route-controller-manager-69584597c4-vjcw9" Feb 21 00:12:51 crc kubenswrapper[4906]: I0221 00:12:51.103475 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsb6s\" (UniqueName: \"kubernetes.io/projected/d82a125e-74b3-4ade-b7df-2c67500a5d58-kube-api-access-rsb6s\") pod \"route-controller-manager-69584597c4-vjcw9\" (UID: \"d82a125e-74b3-4ade-b7df-2c67500a5d58\") " pod="openshift-route-controller-manager/route-controller-manager-69584597c4-vjcw9" Feb 21 00:12:51 crc kubenswrapper[4906]: I0221 00:12:51.204426 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d82a125e-74b3-4ade-b7df-2c67500a5d58-config\") pod \"route-controller-manager-69584597c4-vjcw9\" (UID: \"d82a125e-74b3-4ade-b7df-2c67500a5d58\") " pod="openshift-route-controller-manager/route-controller-manager-69584597c4-vjcw9" Feb 21 00:12:51 crc kubenswrapper[4906]: I0221 00:12:51.205834 4906 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a3363856-b982-4c14-bc21-d76db47e5e1e-proxy-ca-bundles\") pod \"controller-manager-7f975d5594-lbftw\" (UID: \"a3363856-b982-4c14-bc21-d76db47e5e1e\") " pod="openshift-controller-manager/controller-manager-7f975d5594-lbftw" Feb 21 00:12:51 crc kubenswrapper[4906]: I0221 00:12:51.205929 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsb6s\" (UniqueName: \"kubernetes.io/projected/d82a125e-74b3-4ade-b7df-2c67500a5d58-kube-api-access-rsb6s\") pod \"route-controller-manager-69584597c4-vjcw9\" (UID: \"d82a125e-74b3-4ade-b7df-2c67500a5d58\") " pod="openshift-route-controller-manager/route-controller-manager-69584597c4-vjcw9" Feb 21 00:12:51 crc kubenswrapper[4906]: I0221 00:12:51.206084 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d82a125e-74b3-4ade-b7df-2c67500a5d58-client-ca\") pod \"route-controller-manager-69584597c4-vjcw9\" (UID: \"d82a125e-74b3-4ade-b7df-2c67500a5d58\") " pod="openshift-route-controller-manager/route-controller-manager-69584597c4-vjcw9" Feb 21 00:12:51 crc kubenswrapper[4906]: I0221 00:12:51.206176 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d82a125e-74b3-4ade-b7df-2c67500a5d58-serving-cert\") pod \"route-controller-manager-69584597c4-vjcw9\" (UID: \"d82a125e-74b3-4ade-b7df-2c67500a5d58\") " pod="openshift-route-controller-manager/route-controller-manager-69584597c4-vjcw9" Feb 21 00:12:51 crc kubenswrapper[4906]: I0221 00:12:51.206260 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zn2x\" (UniqueName: \"kubernetes.io/projected/a3363856-b982-4c14-bc21-d76db47e5e1e-kube-api-access-6zn2x\") pod \"controller-manager-7f975d5594-lbftw\" (UID: 
\"a3363856-b982-4c14-bc21-d76db47e5e1e\") " pod="openshift-controller-manager/controller-manager-7f975d5594-lbftw" Feb 21 00:12:51 crc kubenswrapper[4906]: I0221 00:12:51.206328 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a3363856-b982-4c14-bc21-d76db47e5e1e-client-ca\") pod \"controller-manager-7f975d5594-lbftw\" (UID: \"a3363856-b982-4c14-bc21-d76db47e5e1e\") " pod="openshift-controller-manager/controller-manager-7f975d5594-lbftw" Feb 21 00:12:51 crc kubenswrapper[4906]: I0221 00:12:51.206503 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3363856-b982-4c14-bc21-d76db47e5e1e-config\") pod \"controller-manager-7f975d5594-lbftw\" (UID: \"a3363856-b982-4c14-bc21-d76db47e5e1e\") " pod="openshift-controller-manager/controller-manager-7f975d5594-lbftw" Feb 21 00:12:51 crc kubenswrapper[4906]: I0221 00:12:51.206569 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3363856-b982-4c14-bc21-d76db47e5e1e-serving-cert\") pod \"controller-manager-7f975d5594-lbftw\" (UID: \"a3363856-b982-4c14-bc21-d76db47e5e1e\") " pod="openshift-controller-manager/controller-manager-7f975d5594-lbftw" Feb 21 00:12:51 crc kubenswrapper[4906]: I0221 00:12:51.206261 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d82a125e-74b3-4ade-b7df-2c67500a5d58-config\") pod \"route-controller-manager-69584597c4-vjcw9\" (UID: \"d82a125e-74b3-4ade-b7df-2c67500a5d58\") " pod="openshift-route-controller-manager/route-controller-manager-69584597c4-vjcw9" Feb 21 00:12:51 crc kubenswrapper[4906]: I0221 00:12:51.206956 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/d82a125e-74b3-4ade-b7df-2c67500a5d58-client-ca\") pod \"route-controller-manager-69584597c4-vjcw9\" (UID: \"d82a125e-74b3-4ade-b7df-2c67500a5d58\") " pod="openshift-route-controller-manager/route-controller-manager-69584597c4-vjcw9" Feb 21 00:12:51 crc kubenswrapper[4906]: I0221 00:12:51.207302 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a3363856-b982-4c14-bc21-d76db47e5e1e-proxy-ca-bundles\") pod \"controller-manager-7f975d5594-lbftw\" (UID: \"a3363856-b982-4c14-bc21-d76db47e5e1e\") " pod="openshift-controller-manager/controller-manager-7f975d5594-lbftw" Feb 21 00:12:51 crc kubenswrapper[4906]: I0221 00:12:51.207843 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a3363856-b982-4c14-bc21-d76db47e5e1e-client-ca\") pod \"controller-manager-7f975d5594-lbftw\" (UID: \"a3363856-b982-4c14-bc21-d76db47e5e1e\") " pod="openshift-controller-manager/controller-manager-7f975d5594-lbftw" Feb 21 00:12:51 crc kubenswrapper[4906]: I0221 00:12:51.211758 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3363856-b982-4c14-bc21-d76db47e5e1e-config\") pod \"controller-manager-7f975d5594-lbftw\" (UID: \"a3363856-b982-4c14-bc21-d76db47e5e1e\") " pod="openshift-controller-manager/controller-manager-7f975d5594-lbftw" Feb 21 00:12:51 crc kubenswrapper[4906]: I0221 00:12:51.213204 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3363856-b982-4c14-bc21-d76db47e5e1e-serving-cert\") pod \"controller-manager-7f975d5594-lbftw\" (UID: \"a3363856-b982-4c14-bc21-d76db47e5e1e\") " pod="openshift-controller-manager/controller-manager-7f975d5594-lbftw" Feb 21 00:12:51 crc kubenswrapper[4906]: I0221 00:12:51.213226 4906 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d82a125e-74b3-4ade-b7df-2c67500a5d58-serving-cert\") pod \"route-controller-manager-69584597c4-vjcw9\" (UID: \"d82a125e-74b3-4ade-b7df-2c67500a5d58\") " pod="openshift-route-controller-manager/route-controller-manager-69584597c4-vjcw9" Feb 21 00:12:51 crc kubenswrapper[4906]: I0221 00:12:51.238778 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsb6s\" (UniqueName: \"kubernetes.io/projected/d82a125e-74b3-4ade-b7df-2c67500a5d58-kube-api-access-rsb6s\") pod \"route-controller-manager-69584597c4-vjcw9\" (UID: \"d82a125e-74b3-4ade-b7df-2c67500a5d58\") " pod="openshift-route-controller-manager/route-controller-manager-69584597c4-vjcw9" Feb 21 00:12:51 crc kubenswrapper[4906]: I0221 00:12:51.248293 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zn2x\" (UniqueName: \"kubernetes.io/projected/a3363856-b982-4c14-bc21-d76db47e5e1e-kube-api-access-6zn2x\") pod \"controller-manager-7f975d5594-lbftw\" (UID: \"a3363856-b982-4c14-bc21-d76db47e5e1e\") " pod="openshift-controller-manager/controller-manager-7f975d5594-lbftw" Feb 21 00:12:51 crc kubenswrapper[4906]: I0221 00:12:51.304543 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69584597c4-vjcw9" Feb 21 00:12:51 crc kubenswrapper[4906]: I0221 00:12:51.320526 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7f975d5594-lbftw" Feb 21 00:12:51 crc kubenswrapper[4906]: I0221 00:12:51.525789 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0d315ba-d07b-4043-ba77-bf8859d16638" path="/var/lib/kubelet/pods/e0d315ba-d07b-4043-ba77-bf8859d16638/volumes" Feb 21 00:12:51 crc kubenswrapper[4906]: I0221 00:12:51.526945 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f51d9e17-fed2-4d4a-aeab-8b135b6222fb" path="/var/lib/kubelet/pods/f51d9e17-fed2-4d4a-aeab-8b135b6222fb/volumes" Feb 21 00:12:51 crc kubenswrapper[4906]: I0221 00:12:51.548144 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69584597c4-vjcw9"] Feb 21 00:12:51 crc kubenswrapper[4906]: I0221 00:12:51.584573 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7f975d5594-lbftw"] Feb 21 00:12:52 crc kubenswrapper[4906]: I0221 00:12:52.252024 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69584597c4-vjcw9" event={"ID":"d82a125e-74b3-4ade-b7df-2c67500a5d58","Type":"ContainerStarted","Data":"567d6dfd8e12855f81cf14a30992865c90b56f00676dcd379f7da26096e135a1"} Feb 21 00:12:52 crc kubenswrapper[4906]: I0221 00:12:52.252388 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69584597c4-vjcw9" event={"ID":"d82a125e-74b3-4ade-b7df-2c67500a5d58","Type":"ContainerStarted","Data":"ee511a120660c703c5eff1c8a2e6d16addbdbc9b5eeff3c9712d3819d8be626e"} Feb 21 00:12:52 crc kubenswrapper[4906]: I0221 00:12:52.252408 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-69584597c4-vjcw9" Feb 21 00:12:52 crc kubenswrapper[4906]: I0221 00:12:52.254622 4906 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f975d5594-lbftw" event={"ID":"a3363856-b982-4c14-bc21-d76db47e5e1e","Type":"ContainerStarted","Data":"e0542cf118b729e6ec235f8b7e6e7af6de7470670c2f310c36f72e630b6bad34"} Feb 21 00:12:52 crc kubenswrapper[4906]: I0221 00:12:52.254655 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f975d5594-lbftw" event={"ID":"a3363856-b982-4c14-bc21-d76db47e5e1e","Type":"ContainerStarted","Data":"7a763b99789fbcc70978c81a7ad8ba355edbbbb3de443f10d7122005ceb7eeda"} Feb 21 00:12:52 crc kubenswrapper[4906]: I0221 00:12:52.254869 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7f975d5594-lbftw" Feb 21 00:12:52 crc kubenswrapper[4906]: I0221 00:12:52.258921 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7f975d5594-lbftw" Feb 21 00:12:52 crc kubenswrapper[4906]: I0221 00:12:52.269769 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-69584597c4-vjcw9" podStartSLOduration=3.269750241 podStartE2EDuration="3.269750241s" podCreationTimestamp="2026-02-21 00:12:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:12:52.269170934 +0000 UTC m=+307.520758450" watchObservedRunningTime="2026-02-21 00:12:52.269750241 +0000 UTC m=+307.521337757" Feb 21 00:12:52 crc kubenswrapper[4906]: I0221 00:12:52.460232 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-69584597c4-vjcw9" Feb 21 00:12:52 crc kubenswrapper[4906]: I0221 00:12:52.478118 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager/controller-manager-7f975d5594-lbftw" podStartSLOduration=3.478096388 podStartE2EDuration="3.478096388s" podCreationTimestamp="2026-02-21 00:12:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:12:52.288821872 +0000 UTC m=+307.540409398" watchObservedRunningTime="2026-02-21 00:12:52.478096388 +0000 UTC m=+307.729683904" Feb 21 00:13:12 crc kubenswrapper[4906]: I0221 00:13:12.597319 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-6flp8"] Feb 21 00:13:12 crc kubenswrapper[4906]: I0221 00:13:12.599067 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-6flp8" Feb 21 00:13:12 crc kubenswrapper[4906]: I0221 00:13:12.606716 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-6flp8"] Feb 21 00:13:12 crc kubenswrapper[4906]: I0221 00:13:12.698336 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-6flp8\" (UID: \"50fcdcbd-5c60-4516-a7ba-f4347623a3c2\") " pod="openshift-image-registry/image-registry-66df7c8f76-6flp8" Feb 21 00:13:12 crc kubenswrapper[4906]: I0221 00:13:12.698385 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/50fcdcbd-5c60-4516-a7ba-f4347623a3c2-trusted-ca\") pod \"image-registry-66df7c8f76-6flp8\" (UID: \"50fcdcbd-5c60-4516-a7ba-f4347623a3c2\") " pod="openshift-image-registry/image-registry-66df7c8f76-6flp8" Feb 21 00:13:12 crc kubenswrapper[4906]: I0221 00:13:12.698441 4906 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/50fcdcbd-5c60-4516-a7ba-f4347623a3c2-bound-sa-token\") pod \"image-registry-66df7c8f76-6flp8\" (UID: \"50fcdcbd-5c60-4516-a7ba-f4347623a3c2\") " pod="openshift-image-registry/image-registry-66df7c8f76-6flp8" Feb 21 00:13:12 crc kubenswrapper[4906]: I0221 00:13:12.698465 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/50fcdcbd-5c60-4516-a7ba-f4347623a3c2-ca-trust-extracted\") pod \"image-registry-66df7c8f76-6flp8\" (UID: \"50fcdcbd-5c60-4516-a7ba-f4347623a3c2\") " pod="openshift-image-registry/image-registry-66df7c8f76-6flp8" Feb 21 00:13:12 crc kubenswrapper[4906]: I0221 00:13:12.698480 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/50fcdcbd-5c60-4516-a7ba-f4347623a3c2-installation-pull-secrets\") pod \"image-registry-66df7c8f76-6flp8\" (UID: \"50fcdcbd-5c60-4516-a7ba-f4347623a3c2\") " pod="openshift-image-registry/image-registry-66df7c8f76-6flp8" Feb 21 00:13:12 crc kubenswrapper[4906]: I0221 00:13:12.698495 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bthm9\" (UniqueName: \"kubernetes.io/projected/50fcdcbd-5c60-4516-a7ba-f4347623a3c2-kube-api-access-bthm9\") pod \"image-registry-66df7c8f76-6flp8\" (UID: \"50fcdcbd-5c60-4516-a7ba-f4347623a3c2\") " pod="openshift-image-registry/image-registry-66df7c8f76-6flp8" Feb 21 00:13:12 crc kubenswrapper[4906]: I0221 00:13:12.698512 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/50fcdcbd-5c60-4516-a7ba-f4347623a3c2-registry-certificates\") pod \"image-registry-66df7c8f76-6flp8\" (UID: 
\"50fcdcbd-5c60-4516-a7ba-f4347623a3c2\") " pod="openshift-image-registry/image-registry-66df7c8f76-6flp8" Feb 21 00:13:12 crc kubenswrapper[4906]: I0221 00:13:12.698530 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/50fcdcbd-5c60-4516-a7ba-f4347623a3c2-registry-tls\") pod \"image-registry-66df7c8f76-6flp8\" (UID: \"50fcdcbd-5c60-4516-a7ba-f4347623a3c2\") " pod="openshift-image-registry/image-registry-66df7c8f76-6flp8" Feb 21 00:13:12 crc kubenswrapper[4906]: I0221 00:13:12.721343 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-6flp8\" (UID: \"50fcdcbd-5c60-4516-a7ba-f4347623a3c2\") " pod="openshift-image-registry/image-registry-66df7c8f76-6flp8" Feb 21 00:13:12 crc kubenswrapper[4906]: I0221 00:13:12.799833 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/50fcdcbd-5c60-4516-a7ba-f4347623a3c2-bound-sa-token\") pod \"image-registry-66df7c8f76-6flp8\" (UID: \"50fcdcbd-5c60-4516-a7ba-f4347623a3c2\") " pod="openshift-image-registry/image-registry-66df7c8f76-6flp8" Feb 21 00:13:12 crc kubenswrapper[4906]: I0221 00:13:12.799907 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/50fcdcbd-5c60-4516-a7ba-f4347623a3c2-ca-trust-extracted\") pod \"image-registry-66df7c8f76-6flp8\" (UID: \"50fcdcbd-5c60-4516-a7ba-f4347623a3c2\") " pod="openshift-image-registry/image-registry-66df7c8f76-6flp8" Feb 21 00:13:12 crc kubenswrapper[4906]: I0221 00:13:12.799943 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/50fcdcbd-5c60-4516-a7ba-f4347623a3c2-installation-pull-secrets\") pod \"image-registry-66df7c8f76-6flp8\" (UID: \"50fcdcbd-5c60-4516-a7ba-f4347623a3c2\") " pod="openshift-image-registry/image-registry-66df7c8f76-6flp8" Feb 21 00:13:12 crc kubenswrapper[4906]: I0221 00:13:12.799978 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bthm9\" (UniqueName: \"kubernetes.io/projected/50fcdcbd-5c60-4516-a7ba-f4347623a3c2-kube-api-access-bthm9\") pod \"image-registry-66df7c8f76-6flp8\" (UID: \"50fcdcbd-5c60-4516-a7ba-f4347623a3c2\") " pod="openshift-image-registry/image-registry-66df7c8f76-6flp8" Feb 21 00:13:12 crc kubenswrapper[4906]: I0221 00:13:12.800009 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/50fcdcbd-5c60-4516-a7ba-f4347623a3c2-registry-certificates\") pod \"image-registry-66df7c8f76-6flp8\" (UID: \"50fcdcbd-5c60-4516-a7ba-f4347623a3c2\") " pod="openshift-image-registry/image-registry-66df7c8f76-6flp8" Feb 21 00:13:12 crc kubenswrapper[4906]: I0221 00:13:12.800040 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/50fcdcbd-5c60-4516-a7ba-f4347623a3c2-registry-tls\") pod \"image-registry-66df7c8f76-6flp8\" (UID: \"50fcdcbd-5c60-4516-a7ba-f4347623a3c2\") " pod="openshift-image-registry/image-registry-66df7c8f76-6flp8" Feb 21 00:13:12 crc kubenswrapper[4906]: I0221 00:13:12.800088 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/50fcdcbd-5c60-4516-a7ba-f4347623a3c2-trusted-ca\") pod \"image-registry-66df7c8f76-6flp8\" (UID: \"50fcdcbd-5c60-4516-a7ba-f4347623a3c2\") " pod="openshift-image-registry/image-registry-66df7c8f76-6flp8" Feb 21 00:13:12 crc kubenswrapper[4906]: I0221 00:13:12.801331 4906 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/50fcdcbd-5c60-4516-a7ba-f4347623a3c2-ca-trust-extracted\") pod \"image-registry-66df7c8f76-6flp8\" (UID: \"50fcdcbd-5c60-4516-a7ba-f4347623a3c2\") " pod="openshift-image-registry/image-registry-66df7c8f76-6flp8" Feb 21 00:13:12 crc kubenswrapper[4906]: I0221 00:13:12.801874 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/50fcdcbd-5c60-4516-a7ba-f4347623a3c2-registry-certificates\") pod \"image-registry-66df7c8f76-6flp8\" (UID: \"50fcdcbd-5c60-4516-a7ba-f4347623a3c2\") " pod="openshift-image-registry/image-registry-66df7c8f76-6flp8" Feb 21 00:13:12 crc kubenswrapper[4906]: I0221 00:13:12.801994 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/50fcdcbd-5c60-4516-a7ba-f4347623a3c2-trusted-ca\") pod \"image-registry-66df7c8f76-6flp8\" (UID: \"50fcdcbd-5c60-4516-a7ba-f4347623a3c2\") " pod="openshift-image-registry/image-registry-66df7c8f76-6flp8" Feb 21 00:13:12 crc kubenswrapper[4906]: I0221 00:13:12.807324 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/50fcdcbd-5c60-4516-a7ba-f4347623a3c2-registry-tls\") pod \"image-registry-66df7c8f76-6flp8\" (UID: \"50fcdcbd-5c60-4516-a7ba-f4347623a3c2\") " pod="openshift-image-registry/image-registry-66df7c8f76-6flp8" Feb 21 00:13:12 crc kubenswrapper[4906]: I0221 00:13:12.816410 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/50fcdcbd-5c60-4516-a7ba-f4347623a3c2-installation-pull-secrets\") pod \"image-registry-66df7c8f76-6flp8\" (UID: \"50fcdcbd-5c60-4516-a7ba-f4347623a3c2\") " pod="openshift-image-registry/image-registry-66df7c8f76-6flp8" Feb 21 00:13:12 crc kubenswrapper[4906]: I0221 
00:13:12.828325 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/50fcdcbd-5c60-4516-a7ba-f4347623a3c2-bound-sa-token\") pod \"image-registry-66df7c8f76-6flp8\" (UID: \"50fcdcbd-5c60-4516-a7ba-f4347623a3c2\") " pod="openshift-image-registry/image-registry-66df7c8f76-6flp8" Feb 21 00:13:12 crc kubenswrapper[4906]: I0221 00:13:12.836316 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bthm9\" (UniqueName: \"kubernetes.io/projected/50fcdcbd-5c60-4516-a7ba-f4347623a3c2-kube-api-access-bthm9\") pod \"image-registry-66df7c8f76-6flp8\" (UID: \"50fcdcbd-5c60-4516-a7ba-f4347623a3c2\") " pod="openshift-image-registry/image-registry-66df7c8f76-6flp8" Feb 21 00:13:12 crc kubenswrapper[4906]: I0221 00:13:12.939941 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-6flp8" Feb 21 00:13:13 crc kubenswrapper[4906]: I0221 00:13:13.339836 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-6flp8"] Feb 21 00:13:13 crc kubenswrapper[4906]: I0221 00:13:13.378902 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-6flp8" event={"ID":"50fcdcbd-5c60-4516-a7ba-f4347623a3c2","Type":"ContainerStarted","Data":"24075c7489c07c3232f775a7e18827f20a2c887217bff7c925ec800346c11ff3"} Feb 21 00:13:14 crc kubenswrapper[4906]: I0221 00:13:14.388892 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-6flp8" event={"ID":"50fcdcbd-5c60-4516-a7ba-f4347623a3c2","Type":"ContainerStarted","Data":"51aefe30d8d8b5ba2d822fb69161d8cc6c0804917c65a1a655ada536b576ec8f"} Feb 21 00:13:14 crc kubenswrapper[4906]: I0221 00:13:14.389651 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-image-registry/image-registry-66df7c8f76-6flp8" Feb 21 00:13:14 crc kubenswrapper[4906]: I0221 00:13:14.427426 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-6flp8" podStartSLOduration=2.427406232 podStartE2EDuration="2.427406232s" podCreationTimestamp="2026-02-21 00:13:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:13:14.417423148 +0000 UTC m=+329.669010724" watchObservedRunningTime="2026-02-21 00:13:14.427406232 +0000 UTC m=+329.678993748" Feb 21 00:13:14 crc kubenswrapper[4906]: I0221 00:13:14.868548 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-h7vwr"] Feb 21 00:13:14 crc kubenswrapper[4906]: I0221 00:13:14.870166 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h7vwr" Feb 21 00:13:14 crc kubenswrapper[4906]: I0221 00:13:14.872891 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 21 00:13:14 crc kubenswrapper[4906]: I0221 00:13:14.882569 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h7vwr"] Feb 21 00:13:14 crc kubenswrapper[4906]: I0221 00:13:14.933632 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/157d9b34-7ca3-40a2-8aae-4eeab9a53ef3-catalog-content\") pod \"redhat-marketplace-h7vwr\" (UID: \"157d9b34-7ca3-40a2-8aae-4eeab9a53ef3\") " pod="openshift-marketplace/redhat-marketplace-h7vwr" Feb 21 00:13:14 crc kubenswrapper[4906]: I0221 00:13:14.933697 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj6cm\" (UniqueName: 
\"kubernetes.io/projected/157d9b34-7ca3-40a2-8aae-4eeab9a53ef3-kube-api-access-cj6cm\") pod \"redhat-marketplace-h7vwr\" (UID: \"157d9b34-7ca3-40a2-8aae-4eeab9a53ef3\") " pod="openshift-marketplace/redhat-marketplace-h7vwr" Feb 21 00:13:14 crc kubenswrapper[4906]: I0221 00:13:14.933785 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/157d9b34-7ca3-40a2-8aae-4eeab9a53ef3-utilities\") pod \"redhat-marketplace-h7vwr\" (UID: \"157d9b34-7ca3-40a2-8aae-4eeab9a53ef3\") " pod="openshift-marketplace/redhat-marketplace-h7vwr" Feb 21 00:13:15 crc kubenswrapper[4906]: I0221 00:13:15.034824 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cj6cm\" (UniqueName: \"kubernetes.io/projected/157d9b34-7ca3-40a2-8aae-4eeab9a53ef3-kube-api-access-cj6cm\") pod \"redhat-marketplace-h7vwr\" (UID: \"157d9b34-7ca3-40a2-8aae-4eeab9a53ef3\") " pod="openshift-marketplace/redhat-marketplace-h7vwr" Feb 21 00:13:15 crc kubenswrapper[4906]: I0221 00:13:15.035001 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/157d9b34-7ca3-40a2-8aae-4eeab9a53ef3-utilities\") pod \"redhat-marketplace-h7vwr\" (UID: \"157d9b34-7ca3-40a2-8aae-4eeab9a53ef3\") " pod="openshift-marketplace/redhat-marketplace-h7vwr" Feb 21 00:13:15 crc kubenswrapper[4906]: I0221 00:13:15.035084 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/157d9b34-7ca3-40a2-8aae-4eeab9a53ef3-catalog-content\") pod \"redhat-marketplace-h7vwr\" (UID: \"157d9b34-7ca3-40a2-8aae-4eeab9a53ef3\") " pod="openshift-marketplace/redhat-marketplace-h7vwr" Feb 21 00:13:15 crc kubenswrapper[4906]: I0221 00:13:15.035605 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/157d9b34-7ca3-40a2-8aae-4eeab9a53ef3-utilities\") pod \"redhat-marketplace-h7vwr\" (UID: \"157d9b34-7ca3-40a2-8aae-4eeab9a53ef3\") " pod="openshift-marketplace/redhat-marketplace-h7vwr" Feb 21 00:13:15 crc kubenswrapper[4906]: I0221 00:13:15.035774 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/157d9b34-7ca3-40a2-8aae-4eeab9a53ef3-catalog-content\") pod \"redhat-marketplace-h7vwr\" (UID: \"157d9b34-7ca3-40a2-8aae-4eeab9a53ef3\") " pod="openshift-marketplace/redhat-marketplace-h7vwr" Feb 21 00:13:15 crc kubenswrapper[4906]: I0221 00:13:15.059436 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cj6cm\" (UniqueName: \"kubernetes.io/projected/157d9b34-7ca3-40a2-8aae-4eeab9a53ef3-kube-api-access-cj6cm\") pod \"redhat-marketplace-h7vwr\" (UID: \"157d9b34-7ca3-40a2-8aae-4eeab9a53ef3\") " pod="openshift-marketplace/redhat-marketplace-h7vwr" Feb 21 00:13:15 crc kubenswrapper[4906]: I0221 00:13:15.230045 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h7vwr" Feb 21 00:13:15 crc kubenswrapper[4906]: I0221 00:13:15.473231 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-q75mj"] Feb 21 00:13:15 crc kubenswrapper[4906]: I0221 00:13:15.475650 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-q75mj" Feb 21 00:13:15 crc kubenswrapper[4906]: I0221 00:13:15.480418 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q75mj"] Feb 21 00:13:15 crc kubenswrapper[4906]: I0221 00:13:15.483881 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 21 00:13:15 crc kubenswrapper[4906]: I0221 00:13:15.543702 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af47c9fc-f15a-4557-bbbb-8426961d3fad-catalog-content\") pod \"redhat-operators-q75mj\" (UID: \"af47c9fc-f15a-4557-bbbb-8426961d3fad\") " pod="openshift-marketplace/redhat-operators-q75mj" Feb 21 00:13:15 crc kubenswrapper[4906]: I0221 00:13:15.544024 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szpmz\" (UniqueName: \"kubernetes.io/projected/af47c9fc-f15a-4557-bbbb-8426961d3fad-kube-api-access-szpmz\") pod \"redhat-operators-q75mj\" (UID: \"af47c9fc-f15a-4557-bbbb-8426961d3fad\") " pod="openshift-marketplace/redhat-operators-q75mj" Feb 21 00:13:15 crc kubenswrapper[4906]: I0221 00:13:15.544129 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af47c9fc-f15a-4557-bbbb-8426961d3fad-utilities\") pod \"redhat-operators-q75mj\" (UID: \"af47c9fc-f15a-4557-bbbb-8426961d3fad\") " pod="openshift-marketplace/redhat-operators-q75mj" Feb 21 00:13:15 crc kubenswrapper[4906]: I0221 00:13:15.637514 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h7vwr"] Feb 21 00:13:15 crc kubenswrapper[4906]: I0221 00:13:15.647355 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/af47c9fc-f15a-4557-bbbb-8426961d3fad-utilities\") pod \"redhat-operators-q75mj\" (UID: \"af47c9fc-f15a-4557-bbbb-8426961d3fad\") " pod="openshift-marketplace/redhat-operators-q75mj" Feb 21 00:13:15 crc kubenswrapper[4906]: I0221 00:13:15.648419 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af47c9fc-f15a-4557-bbbb-8426961d3fad-catalog-content\") pod \"redhat-operators-q75mj\" (UID: \"af47c9fc-f15a-4557-bbbb-8426961d3fad\") " pod="openshift-marketplace/redhat-operators-q75mj" Feb 21 00:13:15 crc kubenswrapper[4906]: I0221 00:13:15.648878 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af47c9fc-f15a-4557-bbbb-8426961d3fad-utilities\") pod \"redhat-operators-q75mj\" (UID: \"af47c9fc-f15a-4557-bbbb-8426961d3fad\") " pod="openshift-marketplace/redhat-operators-q75mj" Feb 21 00:13:15 crc kubenswrapper[4906]: I0221 00:13:15.647499 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af47c9fc-f15a-4557-bbbb-8426961d3fad-catalog-content\") pod \"redhat-operators-q75mj\" (UID: \"af47c9fc-f15a-4557-bbbb-8426961d3fad\") " pod="openshift-marketplace/redhat-operators-q75mj" Feb 21 00:13:15 crc kubenswrapper[4906]: I0221 00:13:15.648991 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szpmz\" (UniqueName: \"kubernetes.io/projected/af47c9fc-f15a-4557-bbbb-8426961d3fad-kube-api-access-szpmz\") pod \"redhat-operators-q75mj\" (UID: \"af47c9fc-f15a-4557-bbbb-8426961d3fad\") " pod="openshift-marketplace/redhat-operators-q75mj" Feb 21 00:13:15 crc kubenswrapper[4906]: W0221 00:13:15.655153 4906 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod157d9b34_7ca3_40a2_8aae_4eeab9a53ef3.slice/crio-896fd9cbdefda09e1f49abc5aa94712d011cecba655317b2c522078fce744796 WatchSource:0}: Error finding container 896fd9cbdefda09e1f49abc5aa94712d011cecba655317b2c522078fce744796: Status 404 returned error can't find the container with id 896fd9cbdefda09e1f49abc5aa94712d011cecba655317b2c522078fce744796 Feb 21 00:13:15 crc kubenswrapper[4906]: I0221 00:13:15.670125 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szpmz\" (UniqueName: \"kubernetes.io/projected/af47c9fc-f15a-4557-bbbb-8426961d3fad-kube-api-access-szpmz\") pod \"redhat-operators-q75mj\" (UID: \"af47c9fc-f15a-4557-bbbb-8426961d3fad\") " pod="openshift-marketplace/redhat-operators-q75mj" Feb 21 00:13:15 crc kubenswrapper[4906]: I0221 00:13:15.813347 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q75mj" Feb 21 00:13:16 crc kubenswrapper[4906]: I0221 00:13:16.261779 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q75mj"] Feb 21 00:13:16 crc kubenswrapper[4906]: I0221 00:13:16.403335 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q75mj" event={"ID":"af47c9fc-f15a-4557-bbbb-8426961d3fad","Type":"ContainerStarted","Data":"19d8c4ecb762d62528e5a608f37d383d3e0458195257d4cff900a18b1bb865f6"} Feb 21 00:13:16 crc kubenswrapper[4906]: I0221 00:13:16.404934 4906 generic.go:334] "Generic (PLEG): container finished" podID="157d9b34-7ca3-40a2-8aae-4eeab9a53ef3" containerID="3b712648705473242521ed78cc2406fac67ca880a168ea9d482edb728bbecf84" exitCode=0 Feb 21 00:13:16 crc kubenswrapper[4906]: I0221 00:13:16.404994 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h7vwr" 
event={"ID":"157d9b34-7ca3-40a2-8aae-4eeab9a53ef3","Type":"ContainerDied","Data":"3b712648705473242521ed78cc2406fac67ca880a168ea9d482edb728bbecf84"} Feb 21 00:13:16 crc kubenswrapper[4906]: I0221 00:13:16.405014 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h7vwr" event={"ID":"157d9b34-7ca3-40a2-8aae-4eeab9a53ef3","Type":"ContainerStarted","Data":"896fd9cbdefda09e1f49abc5aa94712d011cecba655317b2c522078fce744796"} Feb 21 00:13:17 crc kubenswrapper[4906]: I0221 00:13:17.267418 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hxzlw"] Feb 21 00:13:17 crc kubenswrapper[4906]: I0221 00:13:17.270116 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hxzlw" Feb 21 00:13:17 crc kubenswrapper[4906]: I0221 00:13:17.275501 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 21 00:13:17 crc kubenswrapper[4906]: I0221 00:13:17.277307 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hxzlw"] Feb 21 00:13:17 crc kubenswrapper[4906]: I0221 00:13:17.372660 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3554650e-6043-4de9-819e-b4f063f7414e-catalog-content\") pod \"community-operators-hxzlw\" (UID: \"3554650e-6043-4de9-819e-b4f063f7414e\") " pod="openshift-marketplace/community-operators-hxzlw" Feb 21 00:13:17 crc kubenswrapper[4906]: I0221 00:13:17.372831 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3554650e-6043-4de9-819e-b4f063f7414e-utilities\") pod \"community-operators-hxzlw\" (UID: \"3554650e-6043-4de9-819e-b4f063f7414e\") " 
pod="openshift-marketplace/community-operators-hxzlw" Feb 21 00:13:17 crc kubenswrapper[4906]: I0221 00:13:17.372929 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4qtv\" (UniqueName: \"kubernetes.io/projected/3554650e-6043-4de9-819e-b4f063f7414e-kube-api-access-h4qtv\") pod \"community-operators-hxzlw\" (UID: \"3554650e-6043-4de9-819e-b4f063f7414e\") " pod="openshift-marketplace/community-operators-hxzlw" Feb 21 00:13:17 crc kubenswrapper[4906]: I0221 00:13:17.416595 4906 generic.go:334] "Generic (PLEG): container finished" podID="157d9b34-7ca3-40a2-8aae-4eeab9a53ef3" containerID="5540348061dce98b132bf6558a8f39908a3be08ce006e2c306936933762d4de1" exitCode=0 Feb 21 00:13:17 crc kubenswrapper[4906]: I0221 00:13:17.416799 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h7vwr" event={"ID":"157d9b34-7ca3-40a2-8aae-4eeab9a53ef3","Type":"ContainerDied","Data":"5540348061dce98b132bf6558a8f39908a3be08ce006e2c306936933762d4de1"} Feb 21 00:13:17 crc kubenswrapper[4906]: I0221 00:13:17.421642 4906 generic.go:334] "Generic (PLEG): container finished" podID="af47c9fc-f15a-4557-bbbb-8426961d3fad" containerID="454a51c7186591b275b97b601b9f7ef7e18a10265303ccb08be423ebd06e0af3" exitCode=0 Feb 21 00:13:17 crc kubenswrapper[4906]: I0221 00:13:17.421708 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q75mj" event={"ID":"af47c9fc-f15a-4557-bbbb-8426961d3fad","Type":"ContainerDied","Data":"454a51c7186591b275b97b601b9f7ef7e18a10265303ccb08be423ebd06e0af3"} Feb 21 00:13:17 crc kubenswrapper[4906]: I0221 00:13:17.473960 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3554650e-6043-4de9-819e-b4f063f7414e-utilities\") pod \"community-operators-hxzlw\" (UID: \"3554650e-6043-4de9-819e-b4f063f7414e\") " 
pod="openshift-marketplace/community-operators-hxzlw" Feb 21 00:13:17 crc kubenswrapper[4906]: I0221 00:13:17.474011 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4qtv\" (UniqueName: \"kubernetes.io/projected/3554650e-6043-4de9-819e-b4f063f7414e-kube-api-access-h4qtv\") pod \"community-operators-hxzlw\" (UID: \"3554650e-6043-4de9-819e-b4f063f7414e\") " pod="openshift-marketplace/community-operators-hxzlw" Feb 21 00:13:17 crc kubenswrapper[4906]: I0221 00:13:17.474077 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3554650e-6043-4de9-819e-b4f063f7414e-catalog-content\") pod \"community-operators-hxzlw\" (UID: \"3554650e-6043-4de9-819e-b4f063f7414e\") " pod="openshift-marketplace/community-operators-hxzlw" Feb 21 00:13:17 crc kubenswrapper[4906]: I0221 00:13:17.474916 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3554650e-6043-4de9-819e-b4f063f7414e-catalog-content\") pod \"community-operators-hxzlw\" (UID: \"3554650e-6043-4de9-819e-b4f063f7414e\") " pod="openshift-marketplace/community-operators-hxzlw" Feb 21 00:13:17 crc kubenswrapper[4906]: I0221 00:13:17.475065 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3554650e-6043-4de9-819e-b4f063f7414e-utilities\") pod \"community-operators-hxzlw\" (UID: \"3554650e-6043-4de9-819e-b4f063f7414e\") " pod="openshift-marketplace/community-operators-hxzlw" Feb 21 00:13:17 crc kubenswrapper[4906]: I0221 00:13:17.496069 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4qtv\" (UniqueName: \"kubernetes.io/projected/3554650e-6043-4de9-819e-b4f063f7414e-kube-api-access-h4qtv\") pod \"community-operators-hxzlw\" (UID: \"3554650e-6043-4de9-819e-b4f063f7414e\") " 
pod="openshift-marketplace/community-operators-hxzlw" Feb 21 00:13:17 crc kubenswrapper[4906]: I0221 00:13:17.636038 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hxzlw" Feb 21 00:13:17 crc kubenswrapper[4906]: I0221 00:13:17.864024 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-w7db6"] Feb 21 00:13:17 crc kubenswrapper[4906]: I0221 00:13:17.865273 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w7db6" Feb 21 00:13:17 crc kubenswrapper[4906]: I0221 00:13:17.867672 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 21 00:13:17 crc kubenswrapper[4906]: I0221 00:13:17.879917 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w7db6"] Feb 21 00:13:17 crc kubenswrapper[4906]: I0221 00:13:17.981117 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19b5471c-7b90-4839-91c4-31e5f98c3959-utilities\") pod \"certified-operators-w7db6\" (UID: \"19b5471c-7b90-4839-91c4-31e5f98c3959\") " pod="openshift-marketplace/certified-operators-w7db6" Feb 21 00:13:17 crc kubenswrapper[4906]: I0221 00:13:17.981204 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn9zz\" (UniqueName: \"kubernetes.io/projected/19b5471c-7b90-4839-91c4-31e5f98c3959-kube-api-access-tn9zz\") pod \"certified-operators-w7db6\" (UID: \"19b5471c-7b90-4839-91c4-31e5f98c3959\") " pod="openshift-marketplace/certified-operators-w7db6" Feb 21 00:13:17 crc kubenswrapper[4906]: I0221 00:13:17.981268 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/19b5471c-7b90-4839-91c4-31e5f98c3959-catalog-content\") pod \"certified-operators-w7db6\" (UID: \"19b5471c-7b90-4839-91c4-31e5f98c3959\") " pod="openshift-marketplace/certified-operators-w7db6" Feb 21 00:13:18 crc kubenswrapper[4906]: I0221 00:13:18.074410 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hxzlw"] Feb 21 00:13:18 crc kubenswrapper[4906]: I0221 00:13:18.082244 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19b5471c-7b90-4839-91c4-31e5f98c3959-utilities\") pod \"certified-operators-w7db6\" (UID: \"19b5471c-7b90-4839-91c4-31e5f98c3959\") " pod="openshift-marketplace/certified-operators-w7db6" Feb 21 00:13:18 crc kubenswrapper[4906]: I0221 00:13:18.082308 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tn9zz\" (UniqueName: \"kubernetes.io/projected/19b5471c-7b90-4839-91c4-31e5f98c3959-kube-api-access-tn9zz\") pod \"certified-operators-w7db6\" (UID: \"19b5471c-7b90-4839-91c4-31e5f98c3959\") " pod="openshift-marketplace/certified-operators-w7db6" Feb 21 00:13:18 crc kubenswrapper[4906]: I0221 00:13:18.082369 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19b5471c-7b90-4839-91c4-31e5f98c3959-catalog-content\") pod \"certified-operators-w7db6\" (UID: \"19b5471c-7b90-4839-91c4-31e5f98c3959\") " pod="openshift-marketplace/certified-operators-w7db6" Feb 21 00:13:18 crc kubenswrapper[4906]: I0221 00:13:18.082865 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19b5471c-7b90-4839-91c4-31e5f98c3959-catalog-content\") pod \"certified-operators-w7db6\" (UID: \"19b5471c-7b90-4839-91c4-31e5f98c3959\") " pod="openshift-marketplace/certified-operators-w7db6" Feb 21 
00:13:18 crc kubenswrapper[4906]: I0221 00:13:18.083191 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19b5471c-7b90-4839-91c4-31e5f98c3959-utilities\") pod \"certified-operators-w7db6\" (UID: \"19b5471c-7b90-4839-91c4-31e5f98c3959\") " pod="openshift-marketplace/certified-operators-w7db6" Feb 21 00:13:18 crc kubenswrapper[4906]: I0221 00:13:18.105324 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn9zz\" (UniqueName: \"kubernetes.io/projected/19b5471c-7b90-4839-91c4-31e5f98c3959-kube-api-access-tn9zz\") pod \"certified-operators-w7db6\" (UID: \"19b5471c-7b90-4839-91c4-31e5f98c3959\") " pod="openshift-marketplace/certified-operators-w7db6" Feb 21 00:13:18 crc kubenswrapper[4906]: I0221 00:13:18.182091 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w7db6" Feb 21 00:13:18 crc kubenswrapper[4906]: I0221 00:13:18.432216 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h7vwr" event={"ID":"157d9b34-7ca3-40a2-8aae-4eeab9a53ef3","Type":"ContainerStarted","Data":"6a1a7125e367a5c61dc550ecaf4db901c0b704d721b183221c1fdaf1e47e1a37"} Feb 21 00:13:18 crc kubenswrapper[4906]: I0221 00:13:18.436273 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q75mj" event={"ID":"af47c9fc-f15a-4557-bbbb-8426961d3fad","Type":"ContainerStarted","Data":"e336f2ce708781e2c6d815b5082da392eca4288b3fbf3186fcf53c14297fd410"} Feb 21 00:13:18 crc kubenswrapper[4906]: I0221 00:13:18.437876 4906 generic.go:334] "Generic (PLEG): container finished" podID="3554650e-6043-4de9-819e-b4f063f7414e" containerID="3a4059d9ac1cb24f4b3757ebc96a901194cd102bce9ff85ede17b192b1b7e3dc" exitCode=0 Feb 21 00:13:18 crc kubenswrapper[4906]: I0221 00:13:18.437915 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-hxzlw" event={"ID":"3554650e-6043-4de9-819e-b4f063f7414e","Type":"ContainerDied","Data":"3a4059d9ac1cb24f4b3757ebc96a901194cd102bce9ff85ede17b192b1b7e3dc"} Feb 21 00:13:18 crc kubenswrapper[4906]: I0221 00:13:18.437959 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hxzlw" event={"ID":"3554650e-6043-4de9-819e-b4f063f7414e","Type":"ContainerStarted","Data":"e4480ca5a32faadd0a05a954fcb3eba024552fccc4bb58b397f9be0a229ae4f3"} Feb 21 00:13:18 crc kubenswrapper[4906]: I0221 00:13:18.450623 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-h7vwr" podStartSLOduration=2.9891381150000003 podStartE2EDuration="4.450602773s" podCreationTimestamp="2026-02-21 00:13:14 +0000 UTC" firstStartedPulling="2026-02-21 00:13:16.407260381 +0000 UTC m=+331.658847897" lastFinishedPulling="2026-02-21 00:13:17.868725049 +0000 UTC m=+333.120312555" observedRunningTime="2026-02-21 00:13:18.45018561 +0000 UTC m=+333.701773136" watchObservedRunningTime="2026-02-21 00:13:18.450602773 +0000 UTC m=+333.702190289" Feb 21 00:13:18 crc kubenswrapper[4906]: I0221 00:13:18.611267 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w7db6"] Feb 21 00:13:18 crc kubenswrapper[4906]: W0221 00:13:18.618971 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19b5471c_7b90_4839_91c4_31e5f98c3959.slice/crio-0f5d243a7cbff90cf7598b0970f9110a90e430f7835f908244243c2eacdf97e1 WatchSource:0}: Error finding container 0f5d243a7cbff90cf7598b0970f9110a90e430f7835f908244243c2eacdf97e1: Status 404 returned error can't find the container with id 0f5d243a7cbff90cf7598b0970f9110a90e430f7835f908244243c2eacdf97e1 Feb 21 00:13:19 crc kubenswrapper[4906]: I0221 00:13:19.447386 4906 generic.go:334] "Generic (PLEG): container finished" 
podID="19b5471c-7b90-4839-91c4-31e5f98c3959" containerID="a464212b829c9f6d11b39e5d2c3746d82a686aa7c415b7b51590eba2dc76ff90" exitCode=0 Feb 21 00:13:19 crc kubenswrapper[4906]: I0221 00:13:19.447457 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w7db6" event={"ID":"19b5471c-7b90-4839-91c4-31e5f98c3959","Type":"ContainerDied","Data":"a464212b829c9f6d11b39e5d2c3746d82a686aa7c415b7b51590eba2dc76ff90"} Feb 21 00:13:19 crc kubenswrapper[4906]: I0221 00:13:19.447515 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w7db6" event={"ID":"19b5471c-7b90-4839-91c4-31e5f98c3959","Type":"ContainerStarted","Data":"0f5d243a7cbff90cf7598b0970f9110a90e430f7835f908244243c2eacdf97e1"} Feb 21 00:13:19 crc kubenswrapper[4906]: I0221 00:13:19.450433 4906 generic.go:334] "Generic (PLEG): container finished" podID="af47c9fc-f15a-4557-bbbb-8426961d3fad" containerID="e336f2ce708781e2c6d815b5082da392eca4288b3fbf3186fcf53c14297fd410" exitCode=0 Feb 21 00:13:19 crc kubenswrapper[4906]: I0221 00:13:19.450503 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q75mj" event={"ID":"af47c9fc-f15a-4557-bbbb-8426961d3fad","Type":"ContainerDied","Data":"e336f2ce708781e2c6d815b5082da392eca4288b3fbf3186fcf53c14297fd410"} Feb 21 00:13:20 crc kubenswrapper[4906]: I0221 00:13:20.459286 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q75mj" event={"ID":"af47c9fc-f15a-4557-bbbb-8426961d3fad","Type":"ContainerStarted","Data":"53e889240b8e53ba1d6773966d5bfbc669b894231f9e9cf97cded4cce1f52cda"} Feb 21 00:13:20 crc kubenswrapper[4906]: I0221 00:13:20.460671 4906 generic.go:334] "Generic (PLEG): container finished" podID="3554650e-6043-4de9-819e-b4f063f7414e" containerID="95b1c1b091ab31e3a8a8300ecf4ddb72c06a755cebb54f5f633b1ade9ddee9fc" exitCode=0 Feb 21 00:13:20 crc kubenswrapper[4906]: I0221 00:13:20.460775 4906 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hxzlw" event={"ID":"3554650e-6043-4de9-819e-b4f063f7414e","Type":"ContainerDied","Data":"95b1c1b091ab31e3a8a8300ecf4ddb72c06a755cebb54f5f633b1ade9ddee9fc"} Feb 21 00:13:20 crc kubenswrapper[4906]: I0221 00:13:20.462179 4906 generic.go:334] "Generic (PLEG): container finished" podID="19b5471c-7b90-4839-91c4-31e5f98c3959" containerID="640908f6d680bcc4a0e740798e259b4cf5ccec278aaa021f53007cd3fe2496fd" exitCode=0 Feb 21 00:13:20 crc kubenswrapper[4906]: I0221 00:13:20.462216 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w7db6" event={"ID":"19b5471c-7b90-4839-91c4-31e5f98c3959","Type":"ContainerDied","Data":"640908f6d680bcc4a0e740798e259b4cf5ccec278aaa021f53007cd3fe2496fd"} Feb 21 00:13:20 crc kubenswrapper[4906]: I0221 00:13:20.484514 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-q75mj" podStartSLOduration=3.074311211 podStartE2EDuration="5.484491998s" podCreationTimestamp="2026-02-21 00:13:15 +0000 UTC" firstStartedPulling="2026-02-21 00:13:17.425643992 +0000 UTC m=+332.677231538" lastFinishedPulling="2026-02-21 00:13:19.835824769 +0000 UTC m=+335.087412325" observedRunningTime="2026-02-21 00:13:20.476378791 +0000 UTC m=+335.727966327" watchObservedRunningTime="2026-02-21 00:13:20.484491998 +0000 UTC m=+335.736079514" Feb 21 00:13:21 crc kubenswrapper[4906]: I0221 00:13:21.469678 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hxzlw" event={"ID":"3554650e-6043-4de9-819e-b4f063f7414e","Type":"ContainerStarted","Data":"675634ab14bd7773effcbc5b3841960a6834c8fd7b9f614e38a01acfa88b9829"} Feb 21 00:13:21 crc kubenswrapper[4906]: I0221 00:13:21.472332 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w7db6" 
event={"ID":"19b5471c-7b90-4839-91c4-31e5f98c3959","Type":"ContainerStarted","Data":"1379324afc431eed5ded3d97405955b8f2b33ac3d1e86a363da12acfa8f66180"} Feb 21 00:13:21 crc kubenswrapper[4906]: I0221 00:13:21.527645 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-w7db6" podStartSLOduration=3.085924667 podStartE2EDuration="4.527623923s" podCreationTimestamp="2026-02-21 00:13:17 +0000 UTC" firstStartedPulling="2026-02-21 00:13:19.449171471 +0000 UTC m=+334.700758977" lastFinishedPulling="2026-02-21 00:13:20.890870707 +0000 UTC m=+336.142458233" observedRunningTime="2026-02-21 00:13:21.526029965 +0000 UTC m=+336.777617511" watchObservedRunningTime="2026-02-21 00:13:21.527623923 +0000 UTC m=+336.779211449" Feb 21 00:13:21 crc kubenswrapper[4906]: I0221 00:13:21.528923 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hxzlw" podStartSLOduration=2.048452705 podStartE2EDuration="4.528908733s" podCreationTimestamp="2026-02-21 00:13:17 +0000 UTC" firstStartedPulling="2026-02-21 00:13:18.44098665 +0000 UTC m=+333.692574156" lastFinishedPulling="2026-02-21 00:13:20.921442668 +0000 UTC m=+336.173030184" observedRunningTime="2026-02-21 00:13:21.500212859 +0000 UTC m=+336.751800385" watchObservedRunningTime="2026-02-21 00:13:21.528908733 +0000 UTC m=+336.780496259" Feb 21 00:13:25 crc kubenswrapper[4906]: I0221 00:13:25.230726 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-h7vwr" Feb 21 00:13:25 crc kubenswrapper[4906]: I0221 00:13:25.231123 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-h7vwr" Feb 21 00:13:25 crc kubenswrapper[4906]: I0221 00:13:25.293463 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-h7vwr" Feb 21 00:13:25 crc 
kubenswrapper[4906]: I0221 00:13:25.542306 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-h7vwr" Feb 21 00:13:25 crc kubenswrapper[4906]: I0221 00:13:25.814007 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-q75mj" Feb 21 00:13:25 crc kubenswrapper[4906]: I0221 00:13:25.814060 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-q75mj" Feb 21 00:13:26 crc kubenswrapper[4906]: I0221 00:13:26.879344 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-q75mj" podUID="af47c9fc-f15a-4557-bbbb-8426961d3fad" containerName="registry-server" probeResult="failure" output=< Feb 21 00:13:26 crc kubenswrapper[4906]: timeout: failed to connect service ":50051" within 1s Feb 21 00:13:26 crc kubenswrapper[4906]: > Feb 21 00:13:27 crc kubenswrapper[4906]: I0221 00:13:27.636892 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hxzlw" Feb 21 00:13:27 crc kubenswrapper[4906]: I0221 00:13:27.636988 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hxzlw" Feb 21 00:13:27 crc kubenswrapper[4906]: I0221 00:13:27.686600 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hxzlw" Feb 21 00:13:28 crc kubenswrapper[4906]: I0221 00:13:28.182917 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-w7db6" Feb 21 00:13:28 crc kubenswrapper[4906]: I0221 00:13:28.183004 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-w7db6" Feb 21 00:13:28 crc kubenswrapper[4906]: I0221 00:13:28.228387 4906 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-w7db6" Feb 21 00:13:28 crc kubenswrapper[4906]: I0221 00:13:28.577013 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hxzlw" Feb 21 00:13:28 crc kubenswrapper[4906]: I0221 00:13:28.583941 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-w7db6" Feb 21 00:13:29 crc kubenswrapper[4906]: I0221 00:13:29.546286 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7f975d5594-lbftw"] Feb 21 00:13:29 crc kubenswrapper[4906]: I0221 00:13:29.548807 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7f975d5594-lbftw" podUID="a3363856-b982-4c14-bc21-d76db47e5e1e" containerName="controller-manager" containerID="cri-o://e0542cf118b729e6ec235f8b7e6e7af6de7470670c2f310c36f72e630b6bad34" gracePeriod=30 Feb 21 00:13:30 crc kubenswrapper[4906]: I0221 00:13:30.532836 4906 generic.go:334] "Generic (PLEG): container finished" podID="a3363856-b982-4c14-bc21-d76db47e5e1e" containerID="e0542cf118b729e6ec235f8b7e6e7af6de7470670c2f310c36f72e630b6bad34" exitCode=0 Feb 21 00:13:30 crc kubenswrapper[4906]: I0221 00:13:30.532896 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f975d5594-lbftw" event={"ID":"a3363856-b982-4c14-bc21-d76db47e5e1e","Type":"ContainerDied","Data":"e0542cf118b729e6ec235f8b7e6e7af6de7470670c2f310c36f72e630b6bad34"} Feb 21 00:13:30 crc kubenswrapper[4906]: I0221 00:13:30.633902 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7f975d5594-lbftw" Feb 21 00:13:30 crc kubenswrapper[4906]: I0221 00:13:30.669663 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a3363856-b982-4c14-bc21-d76db47e5e1e-proxy-ca-bundles\") pod \"a3363856-b982-4c14-bc21-d76db47e5e1e\" (UID: \"a3363856-b982-4c14-bc21-d76db47e5e1e\") " Feb 21 00:13:30 crc kubenswrapper[4906]: I0221 00:13:30.669975 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3363856-b982-4c14-bc21-d76db47e5e1e-config\") pod \"a3363856-b982-4c14-bc21-d76db47e5e1e\" (UID: \"a3363856-b982-4c14-bc21-d76db47e5e1e\") " Feb 21 00:13:30 crc kubenswrapper[4906]: I0221 00:13:30.670032 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zn2x\" (UniqueName: \"kubernetes.io/projected/a3363856-b982-4c14-bc21-d76db47e5e1e-kube-api-access-6zn2x\") pod \"a3363856-b982-4c14-bc21-d76db47e5e1e\" (UID: \"a3363856-b982-4c14-bc21-d76db47e5e1e\") " Feb 21 00:13:30 crc kubenswrapper[4906]: I0221 00:13:30.670109 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3363856-b982-4c14-bc21-d76db47e5e1e-serving-cert\") pod \"a3363856-b982-4c14-bc21-d76db47e5e1e\" (UID: \"a3363856-b982-4c14-bc21-d76db47e5e1e\") " Feb 21 00:13:30 crc kubenswrapper[4906]: I0221 00:13:30.670168 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a3363856-b982-4c14-bc21-d76db47e5e1e-client-ca\") pod \"a3363856-b982-4c14-bc21-d76db47e5e1e\" (UID: \"a3363856-b982-4c14-bc21-d76db47e5e1e\") " Feb 21 00:13:30 crc kubenswrapper[4906]: I0221 00:13:30.670734 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/a3363856-b982-4c14-bc21-d76db47e5e1e-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a3363856-b982-4c14-bc21-d76db47e5e1e" (UID: "a3363856-b982-4c14-bc21-d76db47e5e1e"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:13:30 crc kubenswrapper[4906]: I0221 00:13:30.670844 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3363856-b982-4c14-bc21-d76db47e5e1e-config" (OuterVolumeSpecName: "config") pod "a3363856-b982-4c14-bc21-d76db47e5e1e" (UID: "a3363856-b982-4c14-bc21-d76db47e5e1e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:13:30 crc kubenswrapper[4906]: I0221 00:13:30.671243 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3363856-b982-4c14-bc21-d76db47e5e1e-client-ca" (OuterVolumeSpecName: "client-ca") pod "a3363856-b982-4c14-bc21-d76db47e5e1e" (UID: "a3363856-b982-4c14-bc21-d76db47e5e1e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:13:30 crc kubenswrapper[4906]: I0221 00:13:30.683787 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3363856-b982-4c14-bc21-d76db47e5e1e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a3363856-b982-4c14-bc21-d76db47e5e1e" (UID: "a3363856-b982-4c14-bc21-d76db47e5e1e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 00:13:30 crc kubenswrapper[4906]: I0221 00:13:30.689821 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3363856-b982-4c14-bc21-d76db47e5e1e-kube-api-access-6zn2x" (OuterVolumeSpecName: "kube-api-access-6zn2x") pod "a3363856-b982-4c14-bc21-d76db47e5e1e" (UID: "a3363856-b982-4c14-bc21-d76db47e5e1e"). InnerVolumeSpecName "kube-api-access-6zn2x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:13:30 crc kubenswrapper[4906]: I0221 00:13:30.689905 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7878f9dfd9-bzdkd"] Feb 21 00:13:30 crc kubenswrapper[4906]: E0221 00:13:30.690273 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3363856-b982-4c14-bc21-d76db47e5e1e" containerName="controller-manager" Feb 21 00:13:30 crc kubenswrapper[4906]: I0221 00:13:30.690298 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3363856-b982-4c14-bc21-d76db47e5e1e" containerName="controller-manager" Feb 21 00:13:30 crc kubenswrapper[4906]: I0221 00:13:30.690502 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3363856-b982-4c14-bc21-d76db47e5e1e" containerName="controller-manager" Feb 21 00:13:30 crc kubenswrapper[4906]: I0221 00:13:30.697425 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7878f9dfd9-bzdkd"] Feb 21 00:13:30 crc kubenswrapper[4906]: I0221 00:13:30.697540 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7878f9dfd9-bzdkd" Feb 21 00:13:30 crc kubenswrapper[4906]: I0221 00:13:30.771470 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7b1946e-883e-4d01-84f6-8d0272bf01f5-serving-cert\") pod \"controller-manager-7878f9dfd9-bzdkd\" (UID: \"f7b1946e-883e-4d01-84f6-8d0272bf01f5\") " pod="openshift-controller-manager/controller-manager-7878f9dfd9-bzdkd" Feb 21 00:13:30 crc kubenswrapper[4906]: I0221 00:13:30.771551 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27kk6\" (UniqueName: \"kubernetes.io/projected/f7b1946e-883e-4d01-84f6-8d0272bf01f5-kube-api-access-27kk6\") pod \"controller-manager-7878f9dfd9-bzdkd\" (UID: \"f7b1946e-883e-4d01-84f6-8d0272bf01f5\") " pod="openshift-controller-manager/controller-manager-7878f9dfd9-bzdkd" Feb 21 00:13:30 crc kubenswrapper[4906]: I0221 00:13:30.771778 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f7b1946e-883e-4d01-84f6-8d0272bf01f5-proxy-ca-bundles\") pod \"controller-manager-7878f9dfd9-bzdkd\" (UID: \"f7b1946e-883e-4d01-84f6-8d0272bf01f5\") " pod="openshift-controller-manager/controller-manager-7878f9dfd9-bzdkd" Feb 21 00:13:30 crc kubenswrapper[4906]: I0221 00:13:30.771900 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f7b1946e-883e-4d01-84f6-8d0272bf01f5-client-ca\") pod \"controller-manager-7878f9dfd9-bzdkd\" (UID: \"f7b1946e-883e-4d01-84f6-8d0272bf01f5\") " pod="openshift-controller-manager/controller-manager-7878f9dfd9-bzdkd" Feb 21 00:13:30 crc kubenswrapper[4906]: I0221 00:13:30.771950 4906 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7b1946e-883e-4d01-84f6-8d0272bf01f5-config\") pod \"controller-manager-7878f9dfd9-bzdkd\" (UID: \"f7b1946e-883e-4d01-84f6-8d0272bf01f5\") " pod="openshift-controller-manager/controller-manager-7878f9dfd9-bzdkd" Feb 21 00:13:30 crc kubenswrapper[4906]: I0221 00:13:30.772096 4906 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a3363856-b982-4c14-bc21-d76db47e5e1e-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 21 00:13:30 crc kubenswrapper[4906]: I0221 00:13:30.772130 4906 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3363856-b982-4c14-bc21-d76db47e5e1e-config\") on node \"crc\" DevicePath \"\"" Feb 21 00:13:30 crc kubenswrapper[4906]: I0221 00:13:30.772143 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zn2x\" (UniqueName: \"kubernetes.io/projected/a3363856-b982-4c14-bc21-d76db47e5e1e-kube-api-access-6zn2x\") on node \"crc\" DevicePath \"\"" Feb 21 00:13:30 crc kubenswrapper[4906]: I0221 00:13:30.772158 4906 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3363856-b982-4c14-bc21-d76db47e5e1e-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 21 00:13:30 crc kubenswrapper[4906]: I0221 00:13:30.772168 4906 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a3363856-b982-4c14-bc21-d76db47e5e1e-client-ca\") on node \"crc\" DevicePath \"\"" Feb 21 00:13:30 crc kubenswrapper[4906]: I0221 00:13:30.873281 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f7b1946e-883e-4d01-84f6-8d0272bf01f5-client-ca\") pod \"controller-manager-7878f9dfd9-bzdkd\" (UID: \"f7b1946e-883e-4d01-84f6-8d0272bf01f5\") " 
pod="openshift-controller-manager/controller-manager-7878f9dfd9-bzdkd" Feb 21 00:13:30 crc kubenswrapper[4906]: I0221 00:13:30.873379 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7b1946e-883e-4d01-84f6-8d0272bf01f5-config\") pod \"controller-manager-7878f9dfd9-bzdkd\" (UID: \"f7b1946e-883e-4d01-84f6-8d0272bf01f5\") " pod="openshift-controller-manager/controller-manager-7878f9dfd9-bzdkd" Feb 21 00:13:30 crc kubenswrapper[4906]: I0221 00:13:30.873437 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7b1946e-883e-4d01-84f6-8d0272bf01f5-serving-cert\") pod \"controller-manager-7878f9dfd9-bzdkd\" (UID: \"f7b1946e-883e-4d01-84f6-8d0272bf01f5\") " pod="openshift-controller-manager/controller-manager-7878f9dfd9-bzdkd" Feb 21 00:13:30 crc kubenswrapper[4906]: I0221 00:13:30.873608 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27kk6\" (UniqueName: \"kubernetes.io/projected/f7b1946e-883e-4d01-84f6-8d0272bf01f5-kube-api-access-27kk6\") pod \"controller-manager-7878f9dfd9-bzdkd\" (UID: \"f7b1946e-883e-4d01-84f6-8d0272bf01f5\") " pod="openshift-controller-manager/controller-manager-7878f9dfd9-bzdkd" Feb 21 00:13:30 crc kubenswrapper[4906]: I0221 00:13:30.873730 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f7b1946e-883e-4d01-84f6-8d0272bf01f5-proxy-ca-bundles\") pod \"controller-manager-7878f9dfd9-bzdkd\" (UID: \"f7b1946e-883e-4d01-84f6-8d0272bf01f5\") " pod="openshift-controller-manager/controller-manager-7878f9dfd9-bzdkd" Feb 21 00:13:30 crc kubenswrapper[4906]: I0221 00:13:30.874943 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f7b1946e-883e-4d01-84f6-8d0272bf01f5-client-ca\") pod 
\"controller-manager-7878f9dfd9-bzdkd\" (UID: \"f7b1946e-883e-4d01-84f6-8d0272bf01f5\") " pod="openshift-controller-manager/controller-manager-7878f9dfd9-bzdkd" Feb 21 00:13:30 crc kubenswrapper[4906]: I0221 00:13:30.875247 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7b1946e-883e-4d01-84f6-8d0272bf01f5-config\") pod \"controller-manager-7878f9dfd9-bzdkd\" (UID: \"f7b1946e-883e-4d01-84f6-8d0272bf01f5\") " pod="openshift-controller-manager/controller-manager-7878f9dfd9-bzdkd" Feb 21 00:13:30 crc kubenswrapper[4906]: I0221 00:13:30.876901 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f7b1946e-883e-4d01-84f6-8d0272bf01f5-proxy-ca-bundles\") pod \"controller-manager-7878f9dfd9-bzdkd\" (UID: \"f7b1946e-883e-4d01-84f6-8d0272bf01f5\") " pod="openshift-controller-manager/controller-manager-7878f9dfd9-bzdkd" Feb 21 00:13:30 crc kubenswrapper[4906]: I0221 00:13:30.879522 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7b1946e-883e-4d01-84f6-8d0272bf01f5-serving-cert\") pod \"controller-manager-7878f9dfd9-bzdkd\" (UID: \"f7b1946e-883e-4d01-84f6-8d0272bf01f5\") " pod="openshift-controller-manager/controller-manager-7878f9dfd9-bzdkd" Feb 21 00:13:30 crc kubenswrapper[4906]: I0221 00:13:30.903571 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27kk6\" (UniqueName: \"kubernetes.io/projected/f7b1946e-883e-4d01-84f6-8d0272bf01f5-kube-api-access-27kk6\") pod \"controller-manager-7878f9dfd9-bzdkd\" (UID: \"f7b1946e-883e-4d01-84f6-8d0272bf01f5\") " pod="openshift-controller-manager/controller-manager-7878f9dfd9-bzdkd" Feb 21 00:13:31 crc kubenswrapper[4906]: I0221 00:13:31.025177 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7878f9dfd9-bzdkd" Feb 21 00:13:31 crc kubenswrapper[4906]: I0221 00:13:31.256623 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7878f9dfd9-bzdkd"] Feb 21 00:13:31 crc kubenswrapper[4906]: W0221 00:13:31.259123 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7b1946e_883e_4d01_84f6_8d0272bf01f5.slice/crio-5a0309a93cf7f0ec81ab2de00dc151e17ab96c0f22d81f690a4daa8583708967 WatchSource:0}: Error finding container 5a0309a93cf7f0ec81ab2de00dc151e17ab96c0f22d81f690a4daa8583708967: Status 404 returned error can't find the container with id 5a0309a93cf7f0ec81ab2de00dc151e17ab96c0f22d81f690a4daa8583708967 Feb 21 00:13:31 crc kubenswrapper[4906]: I0221 00:13:31.549129 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7f975d5594-lbftw" Feb 21 00:13:31 crc kubenswrapper[4906]: I0221 00:13:31.549190 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f975d5594-lbftw" event={"ID":"a3363856-b982-4c14-bc21-d76db47e5e1e","Type":"ContainerDied","Data":"7a763b99789fbcc70978c81a7ad8ba355edbbbb3de443f10d7122005ceb7eeda"} Feb 21 00:13:31 crc kubenswrapper[4906]: I0221 00:13:31.549590 4906 scope.go:117] "RemoveContainer" containerID="e0542cf118b729e6ec235f8b7e6e7af6de7470670c2f310c36f72e630b6bad34" Feb 21 00:13:31 crc kubenswrapper[4906]: I0221 00:13:31.551404 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7878f9dfd9-bzdkd" event={"ID":"f7b1946e-883e-4d01-84f6-8d0272bf01f5","Type":"ContainerStarted","Data":"5a0309a93cf7f0ec81ab2de00dc151e17ab96c0f22d81f690a4daa8583708967"} Feb 21 00:13:31 crc kubenswrapper[4906]: I0221 00:13:31.551982 4906 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7878f9dfd9-bzdkd" Feb 21 00:13:31 crc kubenswrapper[4906]: I0221 00:13:31.554986 4906 patch_prober.go:28] interesting pod/controller-manager-7878f9dfd9-bzdkd container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": dial tcp 10.217.0.67:8443: connect: connection refused" start-of-body= Feb 21 00:13:31 crc kubenswrapper[4906]: I0221 00:13:31.555111 4906 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7878f9dfd9-bzdkd" podUID="f7b1946e-883e-4d01-84f6-8d0272bf01f5" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": dial tcp 10.217.0.67:8443: connect: connection refused" Feb 21 00:13:31 crc kubenswrapper[4906]: I0221 00:13:31.583377 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7878f9dfd9-bzdkd" podStartSLOduration=2.5833573149999998 podStartE2EDuration="2.583357315s" podCreationTimestamp="2026-02-21 00:13:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:13:31.577846397 +0000 UTC m=+346.829433913" watchObservedRunningTime="2026-02-21 00:13:31.583357315 +0000 UTC m=+346.834944821" Feb 21 00:13:31 crc kubenswrapper[4906]: I0221 00:13:31.595724 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7f975d5594-lbftw"] Feb 21 00:13:31 crc kubenswrapper[4906]: I0221 00:13:31.600464 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7f975d5594-lbftw"] Feb 21 00:13:32 crc kubenswrapper[4906]: I0221 00:13:32.591604 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-7878f9dfd9-bzdkd" event={"ID":"f7b1946e-883e-4d01-84f6-8d0272bf01f5","Type":"ContainerStarted","Data":"a5cfce84c1f4aa50729e21afa640f0a445b4d3f7d29ca24488650ea827aa4d5e"} Feb 21 00:13:32 crc kubenswrapper[4906]: I0221 00:13:32.600252 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7878f9dfd9-bzdkd" Feb 21 00:13:32 crc kubenswrapper[4906]: I0221 00:13:32.947753 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-6flp8" Feb 21 00:13:33 crc kubenswrapper[4906]: I0221 00:13:33.017072 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2xzp4"] Feb 21 00:13:33 crc kubenswrapper[4906]: I0221 00:13:33.529504 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3363856-b982-4c14-bc21-d76db47e5e1e" path="/var/lib/kubelet/pods/a3363856-b982-4c14-bc21-d76db47e5e1e/volumes" Feb 21 00:13:36 crc kubenswrapper[4906]: I0221 00:13:35.854392 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-q75mj" Feb 21 00:13:36 crc kubenswrapper[4906]: I0221 00:13:35.912111 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-q75mj" Feb 21 00:13:43 crc kubenswrapper[4906]: I0221 00:13:43.124000 4906 patch_prober.go:28] interesting pod/machine-config-daemon-b9qdv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 00:13:43 crc kubenswrapper[4906]: I0221 00:13:43.126240 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" 
podUID="17518505-fa81-4399-b6cd-5527dae35ef3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 00:13:58 crc kubenswrapper[4906]: I0221 00:13:58.069287 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" podUID="cdadab89-f0cf-4bd7-af7e-17c67a65688a" containerName="registry" containerID="cri-o://3e3a3191e1ec4ce4a512c8b0e42f22812ad4fa09790c420d4caffa085b32019a" gracePeriod=30 Feb 21 00:13:58 crc kubenswrapper[4906]: I0221 00:13:58.570412 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:13:58 crc kubenswrapper[4906]: I0221 00:13:58.689567 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cdadab89-f0cf-4bd7-af7e-17c67a65688a-registry-certificates\") pod \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " Feb 21 00:13:58 crc kubenswrapper[4906]: I0221 00:13:58.689659 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2rpk\" (UniqueName: \"kubernetes.io/projected/cdadab89-f0cf-4bd7-af7e-17c67a65688a-kube-api-access-m2rpk\") pod \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " Feb 21 00:13:58 crc kubenswrapper[4906]: I0221 00:13:58.689735 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cdadab89-f0cf-4bd7-af7e-17c67a65688a-bound-sa-token\") pod \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " Feb 21 00:13:58 crc kubenswrapper[4906]: I0221 00:13:58.689805 4906 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cdadab89-f0cf-4bd7-af7e-17c67a65688a-ca-trust-extracted\") pod \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " Feb 21 00:13:58 crc kubenswrapper[4906]: I0221 00:13:58.689852 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cdadab89-f0cf-4bd7-af7e-17c67a65688a-installation-pull-secrets\") pod \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " Feb 21 00:13:58 crc kubenswrapper[4906]: I0221 00:13:58.690052 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " Feb 21 00:13:58 crc kubenswrapper[4906]: I0221 00:13:58.690111 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cdadab89-f0cf-4bd7-af7e-17c67a65688a-trusted-ca\") pod \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " Feb 21 00:13:58 crc kubenswrapper[4906]: I0221 00:13:58.690156 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cdadab89-f0cf-4bd7-af7e-17c67a65688a-registry-tls\") pod \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\" (UID: \"cdadab89-f0cf-4bd7-af7e-17c67a65688a\") " Feb 21 00:13:58 crc kubenswrapper[4906]: I0221 00:13:58.691830 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdadab89-f0cf-4bd7-af7e-17c67a65688a-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod 
"cdadab89-f0cf-4bd7-af7e-17c67a65688a" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:13:58 crc kubenswrapper[4906]: I0221 00:13:58.692329 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdadab89-f0cf-4bd7-af7e-17c67a65688a-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "cdadab89-f0cf-4bd7-af7e-17c67a65688a" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:13:58 crc kubenswrapper[4906]: I0221 00:13:58.698150 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdadab89-f0cf-4bd7-af7e-17c67a65688a-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "cdadab89-f0cf-4bd7-af7e-17c67a65688a" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:13:58 crc kubenswrapper[4906]: I0221 00:13:58.699037 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdadab89-f0cf-4bd7-af7e-17c67a65688a-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "cdadab89-f0cf-4bd7-af7e-17c67a65688a" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:13:58 crc kubenswrapper[4906]: I0221 00:13:58.699425 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdadab89-f0cf-4bd7-af7e-17c67a65688a-kube-api-access-m2rpk" (OuterVolumeSpecName: "kube-api-access-m2rpk") pod "cdadab89-f0cf-4bd7-af7e-17c67a65688a" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a"). InnerVolumeSpecName "kube-api-access-m2rpk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:13:58 crc kubenswrapper[4906]: I0221 00:13:58.699617 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdadab89-f0cf-4bd7-af7e-17c67a65688a-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "cdadab89-f0cf-4bd7-af7e-17c67a65688a" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 00:13:58 crc kubenswrapper[4906]: I0221 00:13:58.707082 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "cdadab89-f0cf-4bd7-af7e-17c67a65688a" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 21 00:13:58 crc kubenswrapper[4906]: I0221 00:13:58.718566 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdadab89-f0cf-4bd7-af7e-17c67a65688a-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "cdadab89-f0cf-4bd7-af7e-17c67a65688a" (UID: "cdadab89-f0cf-4bd7-af7e-17c67a65688a"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:13:58 crc kubenswrapper[4906]: I0221 00:13:58.753063 4906 generic.go:334] "Generic (PLEG): container finished" podID="cdadab89-f0cf-4bd7-af7e-17c67a65688a" containerID="3e3a3191e1ec4ce4a512c8b0e42f22812ad4fa09790c420d4caffa085b32019a" exitCode=0 Feb 21 00:13:58 crc kubenswrapper[4906]: I0221 00:13:58.753154 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" Feb 21 00:13:58 crc kubenswrapper[4906]: I0221 00:13:58.753195 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" event={"ID":"cdadab89-f0cf-4bd7-af7e-17c67a65688a","Type":"ContainerDied","Data":"3e3a3191e1ec4ce4a512c8b0e42f22812ad4fa09790c420d4caffa085b32019a"} Feb 21 00:13:58 crc kubenswrapper[4906]: I0221 00:13:58.753761 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-2xzp4" event={"ID":"cdadab89-f0cf-4bd7-af7e-17c67a65688a","Type":"ContainerDied","Data":"86e7a5b8618ec7c0d567f3ab48ab570af83b82d1864b1cb46276f6093e9c6295"} Feb 21 00:13:58 crc kubenswrapper[4906]: I0221 00:13:58.753808 4906 scope.go:117] "RemoveContainer" containerID="3e3a3191e1ec4ce4a512c8b0e42f22812ad4fa09790c420d4caffa085b32019a" Feb 21 00:13:58 crc kubenswrapper[4906]: I0221 00:13:58.773533 4906 scope.go:117] "RemoveContainer" containerID="3e3a3191e1ec4ce4a512c8b0e42f22812ad4fa09790c420d4caffa085b32019a" Feb 21 00:13:58 crc kubenswrapper[4906]: E0221 00:13:58.774066 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e3a3191e1ec4ce4a512c8b0e42f22812ad4fa09790c420d4caffa085b32019a\": container with ID starting with 3e3a3191e1ec4ce4a512c8b0e42f22812ad4fa09790c420d4caffa085b32019a not found: ID does not exist" containerID="3e3a3191e1ec4ce4a512c8b0e42f22812ad4fa09790c420d4caffa085b32019a" Feb 21 00:13:58 crc kubenswrapper[4906]: I0221 00:13:58.774129 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e3a3191e1ec4ce4a512c8b0e42f22812ad4fa09790c420d4caffa085b32019a"} err="failed to get container status \"3e3a3191e1ec4ce4a512c8b0e42f22812ad4fa09790c420d4caffa085b32019a\": rpc error: code = NotFound desc = could not find container 
\"3e3a3191e1ec4ce4a512c8b0e42f22812ad4fa09790c420d4caffa085b32019a\": container with ID starting with 3e3a3191e1ec4ce4a512c8b0e42f22812ad4fa09790c420d4caffa085b32019a not found: ID does not exist" Feb 21 00:13:58 crc kubenswrapper[4906]: I0221 00:13:58.792158 4906 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cdadab89-f0cf-4bd7-af7e-17c67a65688a-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 21 00:13:58 crc kubenswrapper[4906]: I0221 00:13:58.792201 4906 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cdadab89-f0cf-4bd7-af7e-17c67a65688a-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 21 00:13:58 crc kubenswrapper[4906]: I0221 00:13:58.792214 4906 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cdadab89-f0cf-4bd7-af7e-17c67a65688a-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 21 00:13:58 crc kubenswrapper[4906]: I0221 00:13:58.792225 4906 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cdadab89-f0cf-4bd7-af7e-17c67a65688a-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 21 00:13:58 crc kubenswrapper[4906]: I0221 00:13:58.792238 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2rpk\" (UniqueName: \"kubernetes.io/projected/cdadab89-f0cf-4bd7-af7e-17c67a65688a-kube-api-access-m2rpk\") on node \"crc\" DevicePath \"\"" Feb 21 00:13:58 crc kubenswrapper[4906]: I0221 00:13:58.792250 4906 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cdadab89-f0cf-4bd7-af7e-17c67a65688a-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 21 00:13:58 crc kubenswrapper[4906]: I0221 00:13:58.792265 4906 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/cdadab89-f0cf-4bd7-af7e-17c67a65688a-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 21 00:13:58 crc kubenswrapper[4906]: I0221 00:13:58.810707 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2xzp4"] Feb 21 00:13:58 crc kubenswrapper[4906]: I0221 00:13:58.817413 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2xzp4"] Feb 21 00:13:59 crc kubenswrapper[4906]: I0221 00:13:59.545112 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdadab89-f0cf-4bd7-af7e-17c67a65688a" path="/var/lib/kubelet/pods/cdadab89-f0cf-4bd7-af7e-17c67a65688a/volumes" Feb 21 00:14:13 crc kubenswrapper[4906]: I0221 00:14:13.124497 4906 patch_prober.go:28] interesting pod/machine-config-daemon-b9qdv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 00:14:13 crc kubenswrapper[4906]: I0221 00:14:13.125338 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 00:14:43 crc kubenswrapper[4906]: I0221 00:14:43.123410 4906 patch_prober.go:28] interesting pod/machine-config-daemon-b9qdv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 00:14:43 crc kubenswrapper[4906]: I0221 00:14:43.124003 4906 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 00:14:43 crc kubenswrapper[4906]: I0221 00:14:43.124053 4906 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" Feb 21 00:14:43 crc kubenswrapper[4906]: I0221 00:14:43.124635 4906 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"202df608ff6999e26cdc9a4d938cc29189ef1bc902ce31f6c1f085f6055345ad"} pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 21 00:14:43 crc kubenswrapper[4906]: I0221 00:14:43.124707 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3" containerName="machine-config-daemon" containerID="cri-o://202df608ff6999e26cdc9a4d938cc29189ef1bc902ce31f6c1f085f6055345ad" gracePeriod=600 Feb 21 00:14:44 crc kubenswrapper[4906]: I0221 00:14:44.050404 4906 generic.go:334] "Generic (PLEG): container finished" podID="17518505-fa81-4399-b6cd-5527dae35ef3" containerID="202df608ff6999e26cdc9a4d938cc29189ef1bc902ce31f6c1f085f6055345ad" exitCode=0 Feb 21 00:14:44 crc kubenswrapper[4906]: I0221 00:14:44.050528 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" event={"ID":"17518505-fa81-4399-b6cd-5527dae35ef3","Type":"ContainerDied","Data":"202df608ff6999e26cdc9a4d938cc29189ef1bc902ce31f6c1f085f6055345ad"} Feb 21 00:14:44 crc kubenswrapper[4906]: I0221 00:14:44.050854 4906 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" event={"ID":"17518505-fa81-4399-b6cd-5527dae35ef3","Type":"ContainerStarted","Data":"6169b545a7a99c9b79ad6bca069941dcc1b133c237cbad0be59cce8e9e5cf28f"} Feb 21 00:14:44 crc kubenswrapper[4906]: I0221 00:14:44.050887 4906 scope.go:117] "RemoveContainer" containerID="c753f098aae83a1b91b668b00166c9de9e5fc03f7a39708263241e934d83fb81" Feb 21 00:15:00 crc kubenswrapper[4906]: I0221 00:15:00.193511 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29527215-z96j8"] Feb 21 00:15:00 crc kubenswrapper[4906]: E0221 00:15:00.206298 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdadab89-f0cf-4bd7-af7e-17c67a65688a" containerName="registry" Feb 21 00:15:00 crc kubenswrapper[4906]: I0221 00:15:00.206555 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdadab89-f0cf-4bd7-af7e-17c67a65688a" containerName="registry" Feb 21 00:15:00 crc kubenswrapper[4906]: I0221 00:15:00.208593 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdadab89-f0cf-4bd7-af7e-17c67a65688a" containerName="registry" Feb 21 00:15:00 crc kubenswrapper[4906]: I0221 00:15:00.210090 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29527215-z96j8" Feb 21 00:15:00 crc kubenswrapper[4906]: I0221 00:15:00.212628 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 21 00:15:00 crc kubenswrapper[4906]: I0221 00:15:00.212763 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 21 00:15:00 crc kubenswrapper[4906]: I0221 00:15:00.213522 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29527215-z96j8"] Feb 21 00:15:00 crc kubenswrapper[4906]: I0221 00:15:00.330041 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4fd26259-dfb9-48bc-9dd0-74b3bdf011d4-config-volume\") pod \"collect-profiles-29527215-z96j8\" (UID: \"4fd26259-dfb9-48bc-9dd0-74b3bdf011d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29527215-z96j8" Feb 21 00:15:00 crc kubenswrapper[4906]: I0221 00:15:00.330362 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qn9s\" (UniqueName: \"kubernetes.io/projected/4fd26259-dfb9-48bc-9dd0-74b3bdf011d4-kube-api-access-4qn9s\") pod \"collect-profiles-29527215-z96j8\" (UID: \"4fd26259-dfb9-48bc-9dd0-74b3bdf011d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29527215-z96j8" Feb 21 00:15:00 crc kubenswrapper[4906]: I0221 00:15:00.330431 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4fd26259-dfb9-48bc-9dd0-74b3bdf011d4-secret-volume\") pod \"collect-profiles-29527215-z96j8\" (UID: \"4fd26259-dfb9-48bc-9dd0-74b3bdf011d4\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29527215-z96j8" Feb 21 00:15:00 crc kubenswrapper[4906]: I0221 00:15:00.431387 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qn9s\" (UniqueName: \"kubernetes.io/projected/4fd26259-dfb9-48bc-9dd0-74b3bdf011d4-kube-api-access-4qn9s\") pod \"collect-profiles-29527215-z96j8\" (UID: \"4fd26259-dfb9-48bc-9dd0-74b3bdf011d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29527215-z96j8" Feb 21 00:15:00 crc kubenswrapper[4906]: I0221 00:15:00.431438 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4fd26259-dfb9-48bc-9dd0-74b3bdf011d4-secret-volume\") pod \"collect-profiles-29527215-z96j8\" (UID: \"4fd26259-dfb9-48bc-9dd0-74b3bdf011d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29527215-z96j8" Feb 21 00:15:00 crc kubenswrapper[4906]: I0221 00:15:00.431471 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4fd26259-dfb9-48bc-9dd0-74b3bdf011d4-config-volume\") pod \"collect-profiles-29527215-z96j8\" (UID: \"4fd26259-dfb9-48bc-9dd0-74b3bdf011d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29527215-z96j8" Feb 21 00:15:00 crc kubenswrapper[4906]: I0221 00:15:00.432397 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4fd26259-dfb9-48bc-9dd0-74b3bdf011d4-config-volume\") pod \"collect-profiles-29527215-z96j8\" (UID: \"4fd26259-dfb9-48bc-9dd0-74b3bdf011d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29527215-z96j8" Feb 21 00:15:00 crc kubenswrapper[4906]: I0221 00:15:00.438881 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/4fd26259-dfb9-48bc-9dd0-74b3bdf011d4-secret-volume\") pod \"collect-profiles-29527215-z96j8\" (UID: \"4fd26259-dfb9-48bc-9dd0-74b3bdf011d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29527215-z96j8" Feb 21 00:15:00 crc kubenswrapper[4906]: I0221 00:15:00.450562 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qn9s\" (UniqueName: \"kubernetes.io/projected/4fd26259-dfb9-48bc-9dd0-74b3bdf011d4-kube-api-access-4qn9s\") pod \"collect-profiles-29527215-z96j8\" (UID: \"4fd26259-dfb9-48bc-9dd0-74b3bdf011d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29527215-z96j8" Feb 21 00:15:00 crc kubenswrapper[4906]: I0221 00:15:00.540265 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29527215-z96j8" Feb 21 00:15:00 crc kubenswrapper[4906]: I0221 00:15:00.736950 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29527215-z96j8"] Feb 21 00:15:00 crc kubenswrapper[4906]: W0221 00:15:00.744779 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fd26259_dfb9_48bc_9dd0_74b3bdf011d4.slice/crio-c3d639eda6e97f08c5f5790636339c482ab047a8a6f36bbb87a916458b7ec989 WatchSource:0}: Error finding container c3d639eda6e97f08c5f5790636339c482ab047a8a6f36bbb87a916458b7ec989: Status 404 returned error can't find the container with id c3d639eda6e97f08c5f5790636339c482ab047a8a6f36bbb87a916458b7ec989 Feb 21 00:15:01 crc kubenswrapper[4906]: I0221 00:15:01.172404 4906 generic.go:334] "Generic (PLEG): container finished" podID="4fd26259-dfb9-48bc-9dd0-74b3bdf011d4" containerID="fb12c5e79846a736c067906c08111f3628dfea12ecd273e402d100969ce92e52" exitCode=0 Feb 21 00:15:01 crc kubenswrapper[4906]: I0221 00:15:01.172478 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29527215-z96j8" event={"ID":"4fd26259-dfb9-48bc-9dd0-74b3bdf011d4","Type":"ContainerDied","Data":"fb12c5e79846a736c067906c08111f3628dfea12ecd273e402d100969ce92e52"} Feb 21 00:15:01 crc kubenswrapper[4906]: I0221 00:15:01.172991 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29527215-z96j8" event={"ID":"4fd26259-dfb9-48bc-9dd0-74b3bdf011d4","Type":"ContainerStarted","Data":"c3d639eda6e97f08c5f5790636339c482ab047a8a6f36bbb87a916458b7ec989"} Feb 21 00:15:02 crc kubenswrapper[4906]: I0221 00:15:02.427571 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29527215-z96j8" Feb 21 00:15:02 crc kubenswrapper[4906]: I0221 00:15:02.458384 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4fd26259-dfb9-48bc-9dd0-74b3bdf011d4-secret-volume\") pod \"4fd26259-dfb9-48bc-9dd0-74b3bdf011d4\" (UID: \"4fd26259-dfb9-48bc-9dd0-74b3bdf011d4\") " Feb 21 00:15:02 crc kubenswrapper[4906]: I0221 00:15:02.458478 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qn9s\" (UniqueName: \"kubernetes.io/projected/4fd26259-dfb9-48bc-9dd0-74b3bdf011d4-kube-api-access-4qn9s\") pod \"4fd26259-dfb9-48bc-9dd0-74b3bdf011d4\" (UID: \"4fd26259-dfb9-48bc-9dd0-74b3bdf011d4\") " Feb 21 00:15:02 crc kubenswrapper[4906]: I0221 00:15:02.458511 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4fd26259-dfb9-48bc-9dd0-74b3bdf011d4-config-volume\") pod \"4fd26259-dfb9-48bc-9dd0-74b3bdf011d4\" (UID: \"4fd26259-dfb9-48bc-9dd0-74b3bdf011d4\") " Feb 21 00:15:02 crc kubenswrapper[4906]: I0221 00:15:02.459747 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/4fd26259-dfb9-48bc-9dd0-74b3bdf011d4-config-volume" (OuterVolumeSpecName: "config-volume") pod "4fd26259-dfb9-48bc-9dd0-74b3bdf011d4" (UID: "4fd26259-dfb9-48bc-9dd0-74b3bdf011d4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:15:02 crc kubenswrapper[4906]: I0221 00:15:02.465883 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fd26259-dfb9-48bc-9dd0-74b3bdf011d4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4fd26259-dfb9-48bc-9dd0-74b3bdf011d4" (UID: "4fd26259-dfb9-48bc-9dd0-74b3bdf011d4"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 00:15:02 crc kubenswrapper[4906]: I0221 00:15:02.465921 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fd26259-dfb9-48bc-9dd0-74b3bdf011d4-kube-api-access-4qn9s" (OuterVolumeSpecName: "kube-api-access-4qn9s") pod "4fd26259-dfb9-48bc-9dd0-74b3bdf011d4" (UID: "4fd26259-dfb9-48bc-9dd0-74b3bdf011d4"). InnerVolumeSpecName "kube-api-access-4qn9s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:15:02 crc kubenswrapper[4906]: I0221 00:15:02.559643 4906 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4fd26259-dfb9-48bc-9dd0-74b3bdf011d4-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 21 00:15:02 crc kubenswrapper[4906]: I0221 00:15:02.559721 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qn9s\" (UniqueName: \"kubernetes.io/projected/4fd26259-dfb9-48bc-9dd0-74b3bdf011d4-kube-api-access-4qn9s\") on node \"crc\" DevicePath \"\"" Feb 21 00:15:02 crc kubenswrapper[4906]: I0221 00:15:02.559734 4906 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4fd26259-dfb9-48bc-9dd0-74b3bdf011d4-config-volume\") on node \"crc\" DevicePath \"\"" Feb 21 00:15:03 crc kubenswrapper[4906]: I0221 00:15:03.190822 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29527215-z96j8" event={"ID":"4fd26259-dfb9-48bc-9dd0-74b3bdf011d4","Type":"ContainerDied","Data":"c3d639eda6e97f08c5f5790636339c482ab047a8a6f36bbb87a916458b7ec989"} Feb 21 00:15:03 crc kubenswrapper[4906]: I0221 00:15:03.190872 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3d639eda6e97f08c5f5790636339c482ab047a8a6f36bbb87a916458b7ec989" Feb 21 00:15:03 crc kubenswrapper[4906]: I0221 00:15:03.190905 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29527215-z96j8" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.487508 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bmsd9"] Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.488376 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" containerName="nbdb" containerID="cri-o://ead16930cc675e04c24dd469e0e7f7ada74a7ef3fe87e095e6702204361d89bf" gracePeriod=30 Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.488562 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" containerName="sbdb" containerID="cri-o://1ef79e3c346342fa9d098b503a7dff636ce88cb88465cf6e802b9581980e46d7" gracePeriod=30 Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.488627 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://b1574681de69105a2b0693609c82ef8c0cd554037fe2f4a2fd7345650bf4a006" gracePeriod=30 Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.488636 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" containerName="northd" containerID="cri-o://2cd0d9e0d6ac22632f0ca07c3ad926d99b71e9d1757e8ef0ee87a2008afbc671" gracePeriod=30 Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.488712 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" containerName="kube-rbac-proxy-node" 
containerID="cri-o://2aae4c734bcf6dc35c8437a49887777a17b228f7196bf2d77a8179923f121351" gracePeriod=30 Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.488750 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" containerName="ovn-acl-logging" containerID="cri-o://e86b35bdd91ab61ef7469c956499be0dd990275293748f4fb8f43f2f42d84fc1" gracePeriod=30 Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.488801 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" containerName="ovn-controller" containerID="cri-o://f8f4202ebd523d2b63c5625f20cb60477ce419a5b34d926dccab2661b2c107aa" gracePeriod=30 Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.542529 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" containerName="ovnkube-controller" containerID="cri-o://753df9d7bebdaf579e7460708758059a58b28e10413d71efcc87a3c3e021c112" gracePeriod=30 Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.806327 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cqkxl_d15db4e7-a13a-4bd9-8083-1ed09be64a82/kube-multus/2.log" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.807414 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cqkxl_d15db4e7-a13a-4bd9-8083-1ed09be64a82/kube-multus/1.log" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.807487 4906 generic.go:334] "Generic (PLEG): container finished" podID="d15db4e7-a13a-4bd9-8083-1ed09be64a82" containerID="02d81b2a66d5bb33ec6a29de8fb43e80fcad7b637e052f4f4f0dd20e1c091ab4" exitCode=2 Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.807581 4906 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-multus/multus-cqkxl" event={"ID":"d15db4e7-a13a-4bd9-8083-1ed09be64a82","Type":"ContainerDied","Data":"02d81b2a66d5bb33ec6a29de8fb43e80fcad7b637e052f4f4f0dd20e1c091ab4"} Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.807670 4906 scope.go:117] "RemoveContainer" containerID="3b20b532af977461f36343fe3b9c58f154e726adea87f4f9c31c95a7c46dc495" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.808671 4906 scope.go:117] "RemoveContainer" containerID="02d81b2a66d5bb33ec6a29de8fb43e80fcad7b637e052f4f4f0dd20e1c091ab4" Feb 21 00:16:34 crc kubenswrapper[4906]: E0221 00:16:34.809057 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-cqkxl_openshift-multus(d15db4e7-a13a-4bd9-8083-1ed09be64a82)\"" pod="openshift-multus/multus-cqkxl" podUID="d15db4e7-a13a-4bd9-8083-1ed09be64a82" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.821621 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmsd9_23efa997-378b-44cd-9f05-4a80559cd09b/ovnkube-controller/3.log" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.824440 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmsd9_23efa997-378b-44cd-9f05-4a80559cd09b/ovn-acl-logging/0.log" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.825261 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmsd9_23efa997-378b-44cd-9f05-4a80559cd09b/ovn-controller/0.log" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.825819 4906 generic.go:334] "Generic (PLEG): container finished" podID="23efa997-378b-44cd-9f05-4a80559cd09b" containerID="753df9d7bebdaf579e7460708758059a58b28e10413d71efcc87a3c3e021c112" exitCode=0 Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.826004 4906 generic.go:334] 
"Generic (PLEG): container finished" podID="23efa997-378b-44cd-9f05-4a80559cd09b" containerID="b1574681de69105a2b0693609c82ef8c0cd554037fe2f4a2fd7345650bf4a006" exitCode=0 Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.826134 4906 generic.go:334] "Generic (PLEG): container finished" podID="23efa997-378b-44cd-9f05-4a80559cd09b" containerID="2aae4c734bcf6dc35c8437a49887777a17b228f7196bf2d77a8179923f121351" exitCode=0 Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.826244 4906 generic.go:334] "Generic (PLEG): container finished" podID="23efa997-378b-44cd-9f05-4a80559cd09b" containerID="e86b35bdd91ab61ef7469c956499be0dd990275293748f4fb8f43f2f42d84fc1" exitCode=143 Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.826349 4906 generic.go:334] "Generic (PLEG): container finished" podID="23efa997-378b-44cd-9f05-4a80559cd09b" containerID="f8f4202ebd523d2b63c5625f20cb60477ce419a5b34d926dccab2661b2c107aa" exitCode=143 Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.826431 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" event={"ID":"23efa997-378b-44cd-9f05-4a80559cd09b","Type":"ContainerDied","Data":"753df9d7bebdaf579e7460708758059a58b28e10413d71efcc87a3c3e021c112"} Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.826655 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" event={"ID":"23efa997-378b-44cd-9f05-4a80559cd09b","Type":"ContainerDied","Data":"b1574681de69105a2b0693609c82ef8c0cd554037fe2f4a2fd7345650bf4a006"} Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.827591 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" event={"ID":"23efa997-378b-44cd-9f05-4a80559cd09b","Type":"ContainerDied","Data":"2aae4c734bcf6dc35c8437a49887777a17b228f7196bf2d77a8179923f121351"} Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.827641 4906 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" event={"ID":"23efa997-378b-44cd-9f05-4a80559cd09b","Type":"ContainerDied","Data":"e86b35bdd91ab61ef7469c956499be0dd990275293748f4fb8f43f2f42d84fc1"} Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.827668 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" event={"ID":"23efa997-378b-44cd-9f05-4a80559cd09b","Type":"ContainerDied","Data":"f8f4202ebd523d2b63c5625f20cb60477ce419a5b34d926dccab2661b2c107aa"} Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.856030 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmsd9_23efa997-378b-44cd-9f05-4a80559cd09b/ovnkube-controller/3.log" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.858643 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmsd9_23efa997-378b-44cd-9f05-4a80559cd09b/ovn-acl-logging/0.log" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.859127 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmsd9_23efa997-378b-44cd-9f05-4a80559cd09b/ovn-controller/0.log" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.859452 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.863341 4906 scope.go:117] "RemoveContainer" containerID="8d03248e0e19e586e9f1d4075918c0c678e58d924fb7daf7c7c30daca5f732a8" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.908024 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-cgbck"] Feb 21 00:16:34 crc kubenswrapper[4906]: E0221 00:16:34.908210 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" containerName="sbdb" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.908220 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" containerName="sbdb" Feb 21 00:16:34 crc kubenswrapper[4906]: E0221 00:16:34.908232 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" containerName="kubecfg-setup" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.908238 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" containerName="kubecfg-setup" Feb 21 00:16:34 crc kubenswrapper[4906]: E0221 00:16:34.908247 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" containerName="ovn-acl-logging" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.908254 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" containerName="ovn-acl-logging" Feb 21 00:16:34 crc kubenswrapper[4906]: E0221 00:16:34.908260 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" containerName="kube-rbac-proxy-node" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.908265 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" containerName="kube-rbac-proxy-node" Feb 21 
00:16:34 crc kubenswrapper[4906]: E0221 00:16:34.908274 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" containerName="kube-rbac-proxy-ovn-metrics" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.908279 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" containerName="kube-rbac-proxy-ovn-metrics" Feb 21 00:16:34 crc kubenswrapper[4906]: E0221 00:16:34.908288 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" containerName="ovnkube-controller" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.908293 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" containerName="ovnkube-controller" Feb 21 00:16:34 crc kubenswrapper[4906]: E0221 00:16:34.908300 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" containerName="nbdb" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.908305 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" containerName="nbdb" Feb 21 00:16:34 crc kubenswrapper[4906]: E0221 00:16:34.908316 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" containerName="northd" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.908323 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" containerName="northd" Feb 21 00:16:34 crc kubenswrapper[4906]: E0221 00:16:34.908333 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" containerName="ovn-controller" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.908340 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" containerName="ovn-controller" Feb 21 00:16:34 crc 
kubenswrapper[4906]: E0221 00:16:34.908349 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fd26259-dfb9-48bc-9dd0-74b3bdf011d4" containerName="collect-profiles" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.908354 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fd26259-dfb9-48bc-9dd0-74b3bdf011d4" containerName="collect-profiles" Feb 21 00:16:34 crc kubenswrapper[4906]: E0221 00:16:34.908361 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" containerName="ovnkube-controller" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.908367 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" containerName="ovnkube-controller" Feb 21 00:16:34 crc kubenswrapper[4906]: E0221 00:16:34.908374 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" containerName="ovnkube-controller" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.908380 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" containerName="ovnkube-controller" Feb 21 00:16:34 crc kubenswrapper[4906]: E0221 00:16:34.908387 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" containerName="ovnkube-controller" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.908393 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" containerName="ovnkube-controller" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.908486 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" containerName="ovnkube-controller" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.908495 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" containerName="ovn-controller" Feb 
21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.908507 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" containerName="kube-rbac-proxy-node" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.908515 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" containerName="sbdb" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.908523 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" containerName="ovn-acl-logging" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.908529 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" containerName="northd" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.908537 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" containerName="ovnkube-controller" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.908544 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fd26259-dfb9-48bc-9dd0-74b3bdf011d4" containerName="collect-profiles" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.908551 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" containerName="ovnkube-controller" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.908557 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" containerName="ovnkube-controller" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.908565 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" containerName="nbdb" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.908573 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" 
containerName="kube-rbac-proxy-ovn-metrics" Feb 21 00:16:34 crc kubenswrapper[4906]: E0221 00:16:34.908655 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" containerName="ovnkube-controller" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.908662 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" containerName="ovnkube-controller" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.908777 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" containerName="ovnkube-controller" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.910201 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.950212 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-node-log\") pod \"23efa997-378b-44cd-9f05-4a80559cd09b\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.950252 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-systemd-units\") pod \"23efa997-378b-44cd-9f05-4a80559cd09b\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.950279 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-host-run-netns\") pod \"23efa997-378b-44cd-9f05-4a80559cd09b\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.950304 4906 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/23efa997-378b-44cd-9f05-4a80559cd09b-ovnkube-config\") pod \"23efa997-378b-44cd-9f05-4a80559cd09b\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.950338 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-host-cni-bin\") pod \"23efa997-378b-44cd-9f05-4a80559cd09b\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.950362 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/23efa997-378b-44cd-9f05-4a80559cd09b-ovnkube-script-lib\") pod \"23efa997-378b-44cd-9f05-4a80559cd09b\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.950368 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "23efa997-378b-44cd-9f05-4a80559cd09b" (UID: "23efa997-378b-44cd-9f05-4a80559cd09b"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.950374 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-node-log" (OuterVolumeSpecName: "node-log") pod "23efa997-378b-44cd-9f05-4a80559cd09b" (UID: "23efa997-378b-44cd-9f05-4a80559cd09b"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.950384 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sbct\" (UniqueName: \"kubernetes.io/projected/23efa997-378b-44cd-9f05-4a80559cd09b-kube-api-access-8sbct\") pod \"23efa997-378b-44cd-9f05-4a80559cd09b\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.950406 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "23efa997-378b-44cd-9f05-4a80559cd09b" (UID: "23efa997-378b-44cd-9f05-4a80559cd09b"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.950420 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-host-cni-netd\") pod \"23efa997-378b-44cd-9f05-4a80559cd09b\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.950450 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/23efa997-378b-44cd-9f05-4a80559cd09b-env-overrides\") pod \"23efa997-378b-44cd-9f05-4a80559cd09b\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.950475 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-var-lib-openvswitch\") pod \"23efa997-378b-44cd-9f05-4a80559cd09b\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.950498 4906 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-run-ovn\") pod \"23efa997-378b-44cd-9f05-4a80559cd09b\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.950523 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"23efa997-378b-44cd-9f05-4a80559cd09b\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.950578 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-etc-openvswitch\") pod \"23efa997-378b-44cd-9f05-4a80559cd09b\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.950598 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-log-socket\") pod \"23efa997-378b-44cd-9f05-4a80559cd09b\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.950617 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-host-kubelet\") pod \"23efa997-378b-44cd-9f05-4a80559cd09b\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.950487 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod 
"23efa997-378b-44cd-9f05-4a80559cd09b" (UID: "23efa997-378b-44cd-9f05-4a80559cd09b"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.950574 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "23efa997-378b-44cd-9f05-4a80559cd09b" (UID: "23efa997-378b-44cd-9f05-4a80559cd09b"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.950640 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-host-run-ovn-kubernetes\") pod \"23efa997-378b-44cd-9f05-4a80559cd09b\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.950674 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/23efa997-378b-44cd-9f05-4a80559cd09b-ovn-node-metrics-cert\") pod \"23efa997-378b-44cd-9f05-4a80559cd09b\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.950722 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-host-slash\") pod \"23efa997-378b-44cd-9f05-4a80559cd09b\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.950563 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-run-ovn" (OuterVolumeSpecName: "run-ovn") pod 
"23efa997-378b-44cd-9f05-4a80559cd09b" (UID: "23efa997-378b-44cd-9f05-4a80559cd09b"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.950743 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "23efa997-378b-44cd-9f05-4a80559cd09b" (UID: "23efa997-378b-44cd-9f05-4a80559cd09b"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.950639 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "23efa997-378b-44cd-9f05-4a80559cd09b" (UID: "23efa997-378b-44cd-9f05-4a80559cd09b"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.950658 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-log-socket" (OuterVolumeSpecName: "log-socket") pod "23efa997-378b-44cd-9f05-4a80559cd09b" (UID: "23efa997-378b-44cd-9f05-4a80559cd09b"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.950674 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "23efa997-378b-44cd-9f05-4a80559cd09b" (UID: "23efa997-378b-44cd-9f05-4a80559cd09b"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.950713 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "23efa997-378b-44cd-9f05-4a80559cd09b" (UID: "23efa997-378b-44cd-9f05-4a80559cd09b"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.950776 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "23efa997-378b-44cd-9f05-4a80559cd09b" (UID: "23efa997-378b-44cd-9f05-4a80559cd09b"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.950676 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "23efa997-378b-44cd-9f05-4a80559cd09b" (UID: "23efa997-378b-44cd-9f05-4a80559cd09b"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.950751 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-run-openvswitch\") pod \"23efa997-378b-44cd-9f05-4a80559cd09b\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.950819 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-run-systemd\") pod \"23efa997-378b-44cd-9f05-4a80559cd09b\" (UID: \"23efa997-378b-44cd-9f05-4a80559cd09b\") " Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.950797 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-host-slash" (OuterVolumeSpecName: "host-slash") pod "23efa997-378b-44cd-9f05-4a80559cd09b" (UID: "23efa997-378b-44cd-9f05-4a80559cd09b"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.950817 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23efa997-378b-44cd-9f05-4a80559cd09b-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "23efa997-378b-44cd-9f05-4a80559cd09b" (UID: "23efa997-378b-44cd-9f05-4a80559cd09b"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.950866 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23efa997-378b-44cd-9f05-4a80559cd09b-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "23efa997-378b-44cd-9f05-4a80559cd09b" (UID: "23efa997-378b-44cd-9f05-4a80559cd09b"). 
InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.950917 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23efa997-378b-44cd-9f05-4a80559cd09b-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "23efa997-378b-44cd-9f05-4a80559cd09b" (UID: "23efa997-378b-44cd-9f05-4a80559cd09b"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.950972 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dd589714-d0d7-4e2c-8f71-9cd57436f267-ovnkube-config\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.951001 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2xbs\" (UniqueName: \"kubernetes.io/projected/dd589714-d0d7-4e2c-8f71-9cd57436f267-kube-api-access-l2xbs\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.951030 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dd589714-d0d7-4e2c-8f71-9cd57436f267-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.951181 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/dd589714-d0d7-4e2c-8f71-9cd57436f267-var-lib-openvswitch\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.951261 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dd589714-d0d7-4e2c-8f71-9cd57436f267-log-socket\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.951315 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dd589714-d0d7-4e2c-8f71-9cd57436f267-host-kubelet\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.951349 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dd589714-d0d7-4e2c-8f71-9cd57436f267-host-slash\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.951377 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dd589714-d0d7-4e2c-8f71-9cd57436f267-host-run-ovn-kubernetes\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.951437 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dd589714-d0d7-4e2c-8f71-9cd57436f267-etc-openvswitch\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.951480 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dd589714-d0d7-4e2c-8f71-9cd57436f267-ovn-node-metrics-cert\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.951600 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dd589714-d0d7-4e2c-8f71-9cd57436f267-run-systemd\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.951670 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dd589714-d0d7-4e2c-8f71-9cd57436f267-env-overrides\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.951746 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dd589714-d0d7-4e2c-8f71-9cd57436f267-host-cni-netd\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.951830 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dd589714-d0d7-4e2c-8f71-9cd57436f267-host-cni-bin\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.951894 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dd589714-d0d7-4e2c-8f71-9cd57436f267-node-log\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.951921 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dd589714-d0d7-4e2c-8f71-9cd57436f267-systemd-units\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.951976 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dd589714-d0d7-4e2c-8f71-9cd57436f267-ovnkube-script-lib\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.952002 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dd589714-d0d7-4e2c-8f71-9cd57436f267-run-openvswitch\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.952029 4906 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dd589714-d0d7-4e2c-8f71-9cd57436f267-host-run-netns\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.952059 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dd589714-d0d7-4e2c-8f71-9cd57436f267-run-ovn\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.952293 4906 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.952316 4906 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-log-socket\") on node \"crc\" DevicePath \"\"" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.952328 4906 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.952352 4906 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.952368 4906 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-host-slash\") on node \"crc\" DevicePath \"\"" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.952380 4906 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.952390 4906 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-node-log\") on node \"crc\" DevicePath \"\"" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.952405 4906 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.952416 4906 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.952428 4906 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/23efa997-378b-44cd-9f05-4a80559cd09b-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.952439 4906 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.952449 4906 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/23efa997-378b-44cd-9f05-4a80559cd09b-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 21 00:16:34 crc 
kubenswrapper[4906]: I0221 00:16:34.952459 4906 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.952470 4906 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/23efa997-378b-44cd-9f05-4a80559cd09b-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.952482 4906 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.952495 4906 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.952506 4906 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.956003 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23efa997-378b-44cd-9f05-4a80559cd09b-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "23efa997-378b-44cd-9f05-4a80559cd09b" (UID: "23efa997-378b-44cd-9f05-4a80559cd09b"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.956552 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23efa997-378b-44cd-9f05-4a80559cd09b-kube-api-access-8sbct" (OuterVolumeSpecName: "kube-api-access-8sbct") pod "23efa997-378b-44cd-9f05-4a80559cd09b" (UID: "23efa997-378b-44cd-9f05-4a80559cd09b"). InnerVolumeSpecName "kube-api-access-8sbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:16:34 crc kubenswrapper[4906]: I0221 00:16:34.963242 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "23efa997-378b-44cd-9f05-4a80559cd09b" (UID: "23efa997-378b-44cd-9f05-4a80559cd09b"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.053876 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dd589714-d0d7-4e2c-8f71-9cd57436f267-ovn-node-metrics-cert\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.053955 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dd589714-d0d7-4e2c-8f71-9cd57436f267-run-systemd\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.053992 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dd589714-d0d7-4e2c-8f71-9cd57436f267-env-overrides\") pod 
\"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.054028 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dd589714-d0d7-4e2c-8f71-9cd57436f267-host-cni-netd\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.054070 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dd589714-d0d7-4e2c-8f71-9cd57436f267-host-cni-bin\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.054107 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dd589714-d0d7-4e2c-8f71-9cd57436f267-node-log\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.054138 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dd589714-d0d7-4e2c-8f71-9cd57436f267-systemd-units\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.054142 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dd589714-d0d7-4e2c-8f71-9cd57436f267-run-systemd\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.054171 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dd589714-d0d7-4e2c-8f71-9cd57436f267-ovnkube-script-lib\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.054176 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dd589714-d0d7-4e2c-8f71-9cd57436f267-host-cni-netd\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.054256 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dd589714-d0d7-4e2c-8f71-9cd57436f267-run-openvswitch\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.054269 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dd589714-d0d7-4e2c-8f71-9cd57436f267-node-log\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.054310 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dd589714-d0d7-4e2c-8f71-9cd57436f267-host-run-netns\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 
00:16:35.054283 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dd589714-d0d7-4e2c-8f71-9cd57436f267-host-run-netns\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.054340 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dd589714-d0d7-4e2c-8f71-9cd57436f267-run-openvswitch\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.054371 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dd589714-d0d7-4e2c-8f71-9cd57436f267-systemd-units\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.054377 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dd589714-d0d7-4e2c-8f71-9cd57436f267-run-ovn\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.054430 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dd589714-d0d7-4e2c-8f71-9cd57436f267-run-ovn\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.054499 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/dd589714-d0d7-4e2c-8f71-9cd57436f267-host-cni-bin\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.054528 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dd589714-d0d7-4e2c-8f71-9cd57436f267-ovnkube-config\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.054606 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2xbs\" (UniqueName: \"kubernetes.io/projected/dd589714-d0d7-4e2c-8f71-9cd57436f267-kube-api-access-l2xbs\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.054676 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dd589714-d0d7-4e2c-8f71-9cd57436f267-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.054900 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dd589714-d0d7-4e2c-8f71-9cd57436f267-env-overrides\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.054947 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/dd589714-d0d7-4e2c-8f71-9cd57436f267-var-lib-openvswitch\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.054937 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dd589714-d0d7-4e2c-8f71-9cd57436f267-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.055056 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dd589714-d0d7-4e2c-8f71-9cd57436f267-var-lib-openvswitch\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.055270 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dd589714-d0d7-4e2c-8f71-9cd57436f267-log-socket\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.055366 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dd589714-d0d7-4e2c-8f71-9cd57436f267-log-socket\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.055455 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/dd589714-d0d7-4e2c-8f71-9cd57436f267-host-kubelet\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.055538 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dd589714-d0d7-4e2c-8f71-9cd57436f267-ovnkube-config\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.055558 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dd589714-d0d7-4e2c-8f71-9cd57436f267-host-kubelet\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.055629 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dd589714-d0d7-4e2c-8f71-9cd57436f267-host-slash\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.055727 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dd589714-d0d7-4e2c-8f71-9cd57436f267-host-slash\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.055766 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dd589714-d0d7-4e2c-8f71-9cd57436f267-host-run-ovn-kubernetes\") pod \"ovnkube-node-cgbck\" 
(UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.055825 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dd589714-d0d7-4e2c-8f71-9cd57436f267-host-run-ovn-kubernetes\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.056140 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dd589714-d0d7-4e2c-8f71-9cd57436f267-ovnkube-script-lib\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.056144 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dd589714-d0d7-4e2c-8f71-9cd57436f267-etc-openvswitch\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.056243 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dd589714-d0d7-4e2c-8f71-9cd57436f267-etc-openvswitch\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.056282 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8sbct\" (UniqueName: \"kubernetes.io/projected/23efa997-378b-44cd-9f05-4a80559cd09b-kube-api-access-8sbct\") on node \"crc\" DevicePath \"\"" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.056428 4906 
reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/23efa997-378b-44cd-9f05-4a80559cd09b-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.056450 4906 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/23efa997-378b-44cd-9f05-4a80559cd09b-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.059976 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dd589714-d0d7-4e2c-8f71-9cd57436f267-ovn-node-metrics-cert\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.075477 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2xbs\" (UniqueName: \"kubernetes.io/projected/dd589714-d0d7-4e2c-8f71-9cd57436f267-kube-api-access-l2xbs\") pod \"ovnkube-node-cgbck\" (UID: \"dd589714-d0d7-4e2c-8f71-9cd57436f267\") " pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.223497 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:35 crc kubenswrapper[4906]: W0221 00:16:35.243942 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd589714_d0d7_4e2c_8f71_9cd57436f267.slice/crio-dfc026903f8790291bed38a74b680a51c68adbf3adf85754c3b298231039cec0 WatchSource:0}: Error finding container dfc026903f8790291bed38a74b680a51c68adbf3adf85754c3b298231039cec0: Status 404 returned error can't find the container with id dfc026903f8790291bed38a74b680a51c68adbf3adf85754c3b298231039cec0 Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.835463 4906 generic.go:334] "Generic (PLEG): container finished" podID="dd589714-d0d7-4e2c-8f71-9cd57436f267" containerID="22739fc0528826ba907da4f6325b51f89b773fe34e44f467c43ba807026548e1" exitCode=0 Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.835566 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" event={"ID":"dd589714-d0d7-4e2c-8f71-9cd57436f267","Type":"ContainerDied","Data":"22739fc0528826ba907da4f6325b51f89b773fe34e44f467c43ba807026548e1"} Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.835624 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" event={"ID":"dd589714-d0d7-4e2c-8f71-9cd57436f267","Type":"ContainerStarted","Data":"dfc026903f8790291bed38a74b680a51c68adbf3adf85754c3b298231039cec0"} Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.843727 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmsd9_23efa997-378b-44cd-9f05-4a80559cd09b/ovn-acl-logging/0.log" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.844362 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bmsd9_23efa997-378b-44cd-9f05-4a80559cd09b/ovn-controller/0.log" Feb 21 00:16:35 crc 
kubenswrapper[4906]: I0221 00:16:35.846026 4906 generic.go:334] "Generic (PLEG): container finished" podID="23efa997-378b-44cd-9f05-4a80559cd09b" containerID="1ef79e3c346342fa9d098b503a7dff636ce88cb88465cf6e802b9581980e46d7" exitCode=0 Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.846048 4906 generic.go:334] "Generic (PLEG): container finished" podID="23efa997-378b-44cd-9f05-4a80559cd09b" containerID="ead16930cc675e04c24dd469e0e7f7ada74a7ef3fe87e095e6702204361d89bf" exitCode=0 Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.846058 4906 generic.go:334] "Generic (PLEG): container finished" podID="23efa997-378b-44cd-9f05-4a80559cd09b" containerID="2cd0d9e0d6ac22632f0ca07c3ad926d99b71e9d1757e8ef0ee87a2008afbc671" exitCode=0 Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.846115 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" event={"ID":"23efa997-378b-44cd-9f05-4a80559cd09b","Type":"ContainerDied","Data":"1ef79e3c346342fa9d098b503a7dff636ce88cb88465cf6e802b9581980e46d7"} Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.846143 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" event={"ID":"23efa997-378b-44cd-9f05-4a80559cd09b","Type":"ContainerDied","Data":"ead16930cc675e04c24dd469e0e7f7ada74a7ef3fe87e095e6702204361d89bf"} Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.846158 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" event={"ID":"23efa997-378b-44cd-9f05-4a80559cd09b","Type":"ContainerDied","Data":"2cd0d9e0d6ac22632f0ca07c3ad926d99b71e9d1757e8ef0ee87a2008afbc671"} Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.846170 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" 
event={"ID":"23efa997-378b-44cd-9f05-4a80559cd09b","Type":"ContainerDied","Data":"b9647615efe8406611f1044cef4c83e89d101c0e78f7f75cf1910d1a11711b91"} Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.846190 4906 scope.go:117] "RemoveContainer" containerID="753df9d7bebdaf579e7460708758059a58b28e10413d71efcc87a3c3e021c112" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.846374 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bmsd9" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.854568 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cqkxl_d15db4e7-a13a-4bd9-8083-1ed09be64a82/kube-multus/2.log" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.894350 4906 scope.go:117] "RemoveContainer" containerID="1ef79e3c346342fa9d098b503a7dff636ce88cb88465cf6e802b9581980e46d7" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.919991 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bmsd9"] Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.931631 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bmsd9"] Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.934737 4906 scope.go:117] "RemoveContainer" containerID="ead16930cc675e04c24dd469e0e7f7ada74a7ef3fe87e095e6702204361d89bf" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.957495 4906 scope.go:117] "RemoveContainer" containerID="2cd0d9e0d6ac22632f0ca07c3ad926d99b71e9d1757e8ef0ee87a2008afbc671" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.969775 4906 scope.go:117] "RemoveContainer" containerID="b1574681de69105a2b0693609c82ef8c0cd554037fe2f4a2fd7345650bf4a006" Feb 21 00:16:35 crc kubenswrapper[4906]: I0221 00:16:35.984566 4906 scope.go:117] "RemoveContainer" containerID="2aae4c734bcf6dc35c8437a49887777a17b228f7196bf2d77a8179923f121351" Feb 21 00:16:35 crc 
kubenswrapper[4906]: I0221 00:16:35.997461 4906 scope.go:117] "RemoveContainer" containerID="e86b35bdd91ab61ef7469c956499be0dd990275293748f4fb8f43f2f42d84fc1" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.014997 4906 scope.go:117] "RemoveContainer" containerID="f8f4202ebd523d2b63c5625f20cb60477ce419a5b34d926dccab2661b2c107aa" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.033067 4906 scope.go:117] "RemoveContainer" containerID="352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.054089 4906 scope.go:117] "RemoveContainer" containerID="753df9d7bebdaf579e7460708758059a58b28e10413d71efcc87a3c3e021c112" Feb 21 00:16:36 crc kubenswrapper[4906]: E0221 00:16:36.054621 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"753df9d7bebdaf579e7460708758059a58b28e10413d71efcc87a3c3e021c112\": container with ID starting with 753df9d7bebdaf579e7460708758059a58b28e10413d71efcc87a3c3e021c112 not found: ID does not exist" containerID="753df9d7bebdaf579e7460708758059a58b28e10413d71efcc87a3c3e021c112" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.054646 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"753df9d7bebdaf579e7460708758059a58b28e10413d71efcc87a3c3e021c112"} err="failed to get container status \"753df9d7bebdaf579e7460708758059a58b28e10413d71efcc87a3c3e021c112\": rpc error: code = NotFound desc = could not find container \"753df9d7bebdaf579e7460708758059a58b28e10413d71efcc87a3c3e021c112\": container with ID starting with 753df9d7bebdaf579e7460708758059a58b28e10413d71efcc87a3c3e021c112 not found: ID does not exist" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.054665 4906 scope.go:117] "RemoveContainer" containerID="1ef79e3c346342fa9d098b503a7dff636ce88cb88465cf6e802b9581980e46d7" Feb 21 00:16:36 crc kubenswrapper[4906]: E0221 00:16:36.055019 
4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ef79e3c346342fa9d098b503a7dff636ce88cb88465cf6e802b9581980e46d7\": container with ID starting with 1ef79e3c346342fa9d098b503a7dff636ce88cb88465cf6e802b9581980e46d7 not found: ID does not exist" containerID="1ef79e3c346342fa9d098b503a7dff636ce88cb88465cf6e802b9581980e46d7" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.055035 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ef79e3c346342fa9d098b503a7dff636ce88cb88465cf6e802b9581980e46d7"} err="failed to get container status \"1ef79e3c346342fa9d098b503a7dff636ce88cb88465cf6e802b9581980e46d7\": rpc error: code = NotFound desc = could not find container \"1ef79e3c346342fa9d098b503a7dff636ce88cb88465cf6e802b9581980e46d7\": container with ID starting with 1ef79e3c346342fa9d098b503a7dff636ce88cb88465cf6e802b9581980e46d7 not found: ID does not exist" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.055054 4906 scope.go:117] "RemoveContainer" containerID="ead16930cc675e04c24dd469e0e7f7ada74a7ef3fe87e095e6702204361d89bf" Feb 21 00:16:36 crc kubenswrapper[4906]: E0221 00:16:36.055266 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ead16930cc675e04c24dd469e0e7f7ada74a7ef3fe87e095e6702204361d89bf\": container with ID starting with ead16930cc675e04c24dd469e0e7f7ada74a7ef3fe87e095e6702204361d89bf not found: ID does not exist" containerID="ead16930cc675e04c24dd469e0e7f7ada74a7ef3fe87e095e6702204361d89bf" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.055284 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ead16930cc675e04c24dd469e0e7f7ada74a7ef3fe87e095e6702204361d89bf"} err="failed to get container status \"ead16930cc675e04c24dd469e0e7f7ada74a7ef3fe87e095e6702204361d89bf\": rpc error: code = 
NotFound desc = could not find container \"ead16930cc675e04c24dd469e0e7f7ada74a7ef3fe87e095e6702204361d89bf\": container with ID starting with ead16930cc675e04c24dd469e0e7f7ada74a7ef3fe87e095e6702204361d89bf not found: ID does not exist" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.055296 4906 scope.go:117] "RemoveContainer" containerID="2cd0d9e0d6ac22632f0ca07c3ad926d99b71e9d1757e8ef0ee87a2008afbc671" Feb 21 00:16:36 crc kubenswrapper[4906]: E0221 00:16:36.055558 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cd0d9e0d6ac22632f0ca07c3ad926d99b71e9d1757e8ef0ee87a2008afbc671\": container with ID starting with 2cd0d9e0d6ac22632f0ca07c3ad926d99b71e9d1757e8ef0ee87a2008afbc671 not found: ID does not exist" containerID="2cd0d9e0d6ac22632f0ca07c3ad926d99b71e9d1757e8ef0ee87a2008afbc671" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.055576 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cd0d9e0d6ac22632f0ca07c3ad926d99b71e9d1757e8ef0ee87a2008afbc671"} err="failed to get container status \"2cd0d9e0d6ac22632f0ca07c3ad926d99b71e9d1757e8ef0ee87a2008afbc671\": rpc error: code = NotFound desc = could not find container \"2cd0d9e0d6ac22632f0ca07c3ad926d99b71e9d1757e8ef0ee87a2008afbc671\": container with ID starting with 2cd0d9e0d6ac22632f0ca07c3ad926d99b71e9d1757e8ef0ee87a2008afbc671 not found: ID does not exist" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.055588 4906 scope.go:117] "RemoveContainer" containerID="b1574681de69105a2b0693609c82ef8c0cd554037fe2f4a2fd7345650bf4a006" Feb 21 00:16:36 crc kubenswrapper[4906]: E0221 00:16:36.055807 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1574681de69105a2b0693609c82ef8c0cd554037fe2f4a2fd7345650bf4a006\": container with ID starting with 
b1574681de69105a2b0693609c82ef8c0cd554037fe2f4a2fd7345650bf4a006 not found: ID does not exist" containerID="b1574681de69105a2b0693609c82ef8c0cd554037fe2f4a2fd7345650bf4a006" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.055824 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1574681de69105a2b0693609c82ef8c0cd554037fe2f4a2fd7345650bf4a006"} err="failed to get container status \"b1574681de69105a2b0693609c82ef8c0cd554037fe2f4a2fd7345650bf4a006\": rpc error: code = NotFound desc = could not find container \"b1574681de69105a2b0693609c82ef8c0cd554037fe2f4a2fd7345650bf4a006\": container with ID starting with b1574681de69105a2b0693609c82ef8c0cd554037fe2f4a2fd7345650bf4a006 not found: ID does not exist" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.055836 4906 scope.go:117] "RemoveContainer" containerID="2aae4c734bcf6dc35c8437a49887777a17b228f7196bf2d77a8179923f121351" Feb 21 00:16:36 crc kubenswrapper[4906]: E0221 00:16:36.056110 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2aae4c734bcf6dc35c8437a49887777a17b228f7196bf2d77a8179923f121351\": container with ID starting with 2aae4c734bcf6dc35c8437a49887777a17b228f7196bf2d77a8179923f121351 not found: ID does not exist" containerID="2aae4c734bcf6dc35c8437a49887777a17b228f7196bf2d77a8179923f121351" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.056125 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2aae4c734bcf6dc35c8437a49887777a17b228f7196bf2d77a8179923f121351"} err="failed to get container status \"2aae4c734bcf6dc35c8437a49887777a17b228f7196bf2d77a8179923f121351\": rpc error: code = NotFound desc = could not find container \"2aae4c734bcf6dc35c8437a49887777a17b228f7196bf2d77a8179923f121351\": container with ID starting with 2aae4c734bcf6dc35c8437a49887777a17b228f7196bf2d77a8179923f121351 not found: ID does not 
exist" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.056136 4906 scope.go:117] "RemoveContainer" containerID="e86b35bdd91ab61ef7469c956499be0dd990275293748f4fb8f43f2f42d84fc1" Feb 21 00:16:36 crc kubenswrapper[4906]: E0221 00:16:36.056436 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e86b35bdd91ab61ef7469c956499be0dd990275293748f4fb8f43f2f42d84fc1\": container with ID starting with e86b35bdd91ab61ef7469c956499be0dd990275293748f4fb8f43f2f42d84fc1 not found: ID does not exist" containerID="e86b35bdd91ab61ef7469c956499be0dd990275293748f4fb8f43f2f42d84fc1" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.056452 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e86b35bdd91ab61ef7469c956499be0dd990275293748f4fb8f43f2f42d84fc1"} err="failed to get container status \"e86b35bdd91ab61ef7469c956499be0dd990275293748f4fb8f43f2f42d84fc1\": rpc error: code = NotFound desc = could not find container \"e86b35bdd91ab61ef7469c956499be0dd990275293748f4fb8f43f2f42d84fc1\": container with ID starting with e86b35bdd91ab61ef7469c956499be0dd990275293748f4fb8f43f2f42d84fc1 not found: ID does not exist" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.056462 4906 scope.go:117] "RemoveContainer" containerID="f8f4202ebd523d2b63c5625f20cb60477ce419a5b34d926dccab2661b2c107aa" Feb 21 00:16:36 crc kubenswrapper[4906]: E0221 00:16:36.056726 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8f4202ebd523d2b63c5625f20cb60477ce419a5b34d926dccab2661b2c107aa\": container with ID starting with f8f4202ebd523d2b63c5625f20cb60477ce419a5b34d926dccab2661b2c107aa not found: ID does not exist" containerID="f8f4202ebd523d2b63c5625f20cb60477ce419a5b34d926dccab2661b2c107aa" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.056746 4906 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8f4202ebd523d2b63c5625f20cb60477ce419a5b34d926dccab2661b2c107aa"} err="failed to get container status \"f8f4202ebd523d2b63c5625f20cb60477ce419a5b34d926dccab2661b2c107aa\": rpc error: code = NotFound desc = could not find container \"f8f4202ebd523d2b63c5625f20cb60477ce419a5b34d926dccab2661b2c107aa\": container with ID starting with f8f4202ebd523d2b63c5625f20cb60477ce419a5b34d926dccab2661b2c107aa not found: ID does not exist" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.056758 4906 scope.go:117] "RemoveContainer" containerID="352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668" Feb 21 00:16:36 crc kubenswrapper[4906]: E0221 00:16:36.056930 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\": container with ID starting with 352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668 not found: ID does not exist" containerID="352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.056950 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668"} err="failed to get container status \"352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\": rpc error: code = NotFound desc = could not find container \"352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\": container with ID starting with 352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668 not found: ID does not exist" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.056962 4906 scope.go:117] "RemoveContainer" containerID="753df9d7bebdaf579e7460708758059a58b28e10413d71efcc87a3c3e021c112" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.057209 4906 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"753df9d7bebdaf579e7460708758059a58b28e10413d71efcc87a3c3e021c112"} err="failed to get container status \"753df9d7bebdaf579e7460708758059a58b28e10413d71efcc87a3c3e021c112\": rpc error: code = NotFound desc = could not find container \"753df9d7bebdaf579e7460708758059a58b28e10413d71efcc87a3c3e021c112\": container with ID starting with 753df9d7bebdaf579e7460708758059a58b28e10413d71efcc87a3c3e021c112 not found: ID does not exist" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.057227 4906 scope.go:117] "RemoveContainer" containerID="1ef79e3c346342fa9d098b503a7dff636ce88cb88465cf6e802b9581980e46d7" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.057422 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ef79e3c346342fa9d098b503a7dff636ce88cb88465cf6e802b9581980e46d7"} err="failed to get container status \"1ef79e3c346342fa9d098b503a7dff636ce88cb88465cf6e802b9581980e46d7\": rpc error: code = NotFound desc = could not find container \"1ef79e3c346342fa9d098b503a7dff636ce88cb88465cf6e802b9581980e46d7\": container with ID starting with 1ef79e3c346342fa9d098b503a7dff636ce88cb88465cf6e802b9581980e46d7 not found: ID does not exist" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.057440 4906 scope.go:117] "RemoveContainer" containerID="ead16930cc675e04c24dd469e0e7f7ada74a7ef3fe87e095e6702204361d89bf" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.057737 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ead16930cc675e04c24dd469e0e7f7ada74a7ef3fe87e095e6702204361d89bf"} err="failed to get container status \"ead16930cc675e04c24dd469e0e7f7ada74a7ef3fe87e095e6702204361d89bf\": rpc error: code = NotFound desc = could not find container \"ead16930cc675e04c24dd469e0e7f7ada74a7ef3fe87e095e6702204361d89bf\": container with ID starting with 
ead16930cc675e04c24dd469e0e7f7ada74a7ef3fe87e095e6702204361d89bf not found: ID does not exist" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.057757 4906 scope.go:117] "RemoveContainer" containerID="2cd0d9e0d6ac22632f0ca07c3ad926d99b71e9d1757e8ef0ee87a2008afbc671" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.058253 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cd0d9e0d6ac22632f0ca07c3ad926d99b71e9d1757e8ef0ee87a2008afbc671"} err="failed to get container status \"2cd0d9e0d6ac22632f0ca07c3ad926d99b71e9d1757e8ef0ee87a2008afbc671\": rpc error: code = NotFound desc = could not find container \"2cd0d9e0d6ac22632f0ca07c3ad926d99b71e9d1757e8ef0ee87a2008afbc671\": container with ID starting with 2cd0d9e0d6ac22632f0ca07c3ad926d99b71e9d1757e8ef0ee87a2008afbc671 not found: ID does not exist" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.058271 4906 scope.go:117] "RemoveContainer" containerID="b1574681de69105a2b0693609c82ef8c0cd554037fe2f4a2fd7345650bf4a006" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.058531 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1574681de69105a2b0693609c82ef8c0cd554037fe2f4a2fd7345650bf4a006"} err="failed to get container status \"b1574681de69105a2b0693609c82ef8c0cd554037fe2f4a2fd7345650bf4a006\": rpc error: code = NotFound desc = could not find container \"b1574681de69105a2b0693609c82ef8c0cd554037fe2f4a2fd7345650bf4a006\": container with ID starting with b1574681de69105a2b0693609c82ef8c0cd554037fe2f4a2fd7345650bf4a006 not found: ID does not exist" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.058548 4906 scope.go:117] "RemoveContainer" containerID="2aae4c734bcf6dc35c8437a49887777a17b228f7196bf2d77a8179923f121351" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.058730 4906 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2aae4c734bcf6dc35c8437a49887777a17b228f7196bf2d77a8179923f121351"} err="failed to get container status \"2aae4c734bcf6dc35c8437a49887777a17b228f7196bf2d77a8179923f121351\": rpc error: code = NotFound desc = could not find container \"2aae4c734bcf6dc35c8437a49887777a17b228f7196bf2d77a8179923f121351\": container with ID starting with 2aae4c734bcf6dc35c8437a49887777a17b228f7196bf2d77a8179923f121351 not found: ID does not exist" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.058747 4906 scope.go:117] "RemoveContainer" containerID="e86b35bdd91ab61ef7469c956499be0dd990275293748f4fb8f43f2f42d84fc1" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.059038 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e86b35bdd91ab61ef7469c956499be0dd990275293748f4fb8f43f2f42d84fc1"} err="failed to get container status \"e86b35bdd91ab61ef7469c956499be0dd990275293748f4fb8f43f2f42d84fc1\": rpc error: code = NotFound desc = could not find container \"e86b35bdd91ab61ef7469c956499be0dd990275293748f4fb8f43f2f42d84fc1\": container with ID starting with e86b35bdd91ab61ef7469c956499be0dd990275293748f4fb8f43f2f42d84fc1 not found: ID does not exist" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.059055 4906 scope.go:117] "RemoveContainer" containerID="f8f4202ebd523d2b63c5625f20cb60477ce419a5b34d926dccab2661b2c107aa" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.059243 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8f4202ebd523d2b63c5625f20cb60477ce419a5b34d926dccab2661b2c107aa"} err="failed to get container status \"f8f4202ebd523d2b63c5625f20cb60477ce419a5b34d926dccab2661b2c107aa\": rpc error: code = NotFound desc = could not find container \"f8f4202ebd523d2b63c5625f20cb60477ce419a5b34d926dccab2661b2c107aa\": container with ID starting with f8f4202ebd523d2b63c5625f20cb60477ce419a5b34d926dccab2661b2c107aa not found: ID does not 
exist" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.059261 4906 scope.go:117] "RemoveContainer" containerID="352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.059480 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668"} err="failed to get container status \"352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\": rpc error: code = NotFound desc = could not find container \"352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\": container with ID starting with 352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668 not found: ID does not exist" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.059500 4906 scope.go:117] "RemoveContainer" containerID="753df9d7bebdaf579e7460708758059a58b28e10413d71efcc87a3c3e021c112" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.059672 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"753df9d7bebdaf579e7460708758059a58b28e10413d71efcc87a3c3e021c112"} err="failed to get container status \"753df9d7bebdaf579e7460708758059a58b28e10413d71efcc87a3c3e021c112\": rpc error: code = NotFound desc = could not find container \"753df9d7bebdaf579e7460708758059a58b28e10413d71efcc87a3c3e021c112\": container with ID starting with 753df9d7bebdaf579e7460708758059a58b28e10413d71efcc87a3c3e021c112 not found: ID does not exist" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.059702 4906 scope.go:117] "RemoveContainer" containerID="1ef79e3c346342fa9d098b503a7dff636ce88cb88465cf6e802b9581980e46d7" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.059998 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ef79e3c346342fa9d098b503a7dff636ce88cb88465cf6e802b9581980e46d7"} err="failed to get container status 
\"1ef79e3c346342fa9d098b503a7dff636ce88cb88465cf6e802b9581980e46d7\": rpc error: code = NotFound desc = could not find container \"1ef79e3c346342fa9d098b503a7dff636ce88cb88465cf6e802b9581980e46d7\": container with ID starting with 1ef79e3c346342fa9d098b503a7dff636ce88cb88465cf6e802b9581980e46d7 not found: ID does not exist" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.060011 4906 scope.go:117] "RemoveContainer" containerID="ead16930cc675e04c24dd469e0e7f7ada74a7ef3fe87e095e6702204361d89bf" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.060282 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ead16930cc675e04c24dd469e0e7f7ada74a7ef3fe87e095e6702204361d89bf"} err="failed to get container status \"ead16930cc675e04c24dd469e0e7f7ada74a7ef3fe87e095e6702204361d89bf\": rpc error: code = NotFound desc = could not find container \"ead16930cc675e04c24dd469e0e7f7ada74a7ef3fe87e095e6702204361d89bf\": container with ID starting with ead16930cc675e04c24dd469e0e7f7ada74a7ef3fe87e095e6702204361d89bf not found: ID does not exist" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.060305 4906 scope.go:117] "RemoveContainer" containerID="2cd0d9e0d6ac22632f0ca07c3ad926d99b71e9d1757e8ef0ee87a2008afbc671" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.060656 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cd0d9e0d6ac22632f0ca07c3ad926d99b71e9d1757e8ef0ee87a2008afbc671"} err="failed to get container status \"2cd0d9e0d6ac22632f0ca07c3ad926d99b71e9d1757e8ef0ee87a2008afbc671\": rpc error: code = NotFound desc = could not find container \"2cd0d9e0d6ac22632f0ca07c3ad926d99b71e9d1757e8ef0ee87a2008afbc671\": container with ID starting with 2cd0d9e0d6ac22632f0ca07c3ad926d99b71e9d1757e8ef0ee87a2008afbc671 not found: ID does not exist" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.060670 4906 scope.go:117] "RemoveContainer" 
containerID="b1574681de69105a2b0693609c82ef8c0cd554037fe2f4a2fd7345650bf4a006" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.060838 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1574681de69105a2b0693609c82ef8c0cd554037fe2f4a2fd7345650bf4a006"} err="failed to get container status \"b1574681de69105a2b0693609c82ef8c0cd554037fe2f4a2fd7345650bf4a006\": rpc error: code = NotFound desc = could not find container \"b1574681de69105a2b0693609c82ef8c0cd554037fe2f4a2fd7345650bf4a006\": container with ID starting with b1574681de69105a2b0693609c82ef8c0cd554037fe2f4a2fd7345650bf4a006 not found: ID does not exist" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.060853 4906 scope.go:117] "RemoveContainer" containerID="2aae4c734bcf6dc35c8437a49887777a17b228f7196bf2d77a8179923f121351" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.061103 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2aae4c734bcf6dc35c8437a49887777a17b228f7196bf2d77a8179923f121351"} err="failed to get container status \"2aae4c734bcf6dc35c8437a49887777a17b228f7196bf2d77a8179923f121351\": rpc error: code = NotFound desc = could not find container \"2aae4c734bcf6dc35c8437a49887777a17b228f7196bf2d77a8179923f121351\": container with ID starting with 2aae4c734bcf6dc35c8437a49887777a17b228f7196bf2d77a8179923f121351 not found: ID does not exist" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.061119 4906 scope.go:117] "RemoveContainer" containerID="e86b35bdd91ab61ef7469c956499be0dd990275293748f4fb8f43f2f42d84fc1" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.061357 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e86b35bdd91ab61ef7469c956499be0dd990275293748f4fb8f43f2f42d84fc1"} err="failed to get container status \"e86b35bdd91ab61ef7469c956499be0dd990275293748f4fb8f43f2f42d84fc1\": rpc error: code = NotFound desc = could 
not find container \"e86b35bdd91ab61ef7469c956499be0dd990275293748f4fb8f43f2f42d84fc1\": container with ID starting with e86b35bdd91ab61ef7469c956499be0dd990275293748f4fb8f43f2f42d84fc1 not found: ID does not exist" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.061373 4906 scope.go:117] "RemoveContainer" containerID="f8f4202ebd523d2b63c5625f20cb60477ce419a5b34d926dccab2661b2c107aa" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.061622 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8f4202ebd523d2b63c5625f20cb60477ce419a5b34d926dccab2661b2c107aa"} err="failed to get container status \"f8f4202ebd523d2b63c5625f20cb60477ce419a5b34d926dccab2661b2c107aa\": rpc error: code = NotFound desc = could not find container \"f8f4202ebd523d2b63c5625f20cb60477ce419a5b34d926dccab2661b2c107aa\": container with ID starting with f8f4202ebd523d2b63c5625f20cb60477ce419a5b34d926dccab2661b2c107aa not found: ID does not exist" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.061641 4906 scope.go:117] "RemoveContainer" containerID="352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.061998 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668"} err="failed to get container status \"352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\": rpc error: code = NotFound desc = could not find container \"352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668\": container with ID starting with 352a09dbb2d285442bced48d7bc292ad6eebab141fb22c1e962d31e7c9688668 not found: ID does not exist" Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.862871 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" 
event={"ID":"dd589714-d0d7-4e2c-8f71-9cd57436f267","Type":"ContainerStarted","Data":"f4a5587fbb2b5c7a657767f57ffe1d092e294c6e17f9f5efb9bec1a3bac859fb"} Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.863219 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" event={"ID":"dd589714-d0d7-4e2c-8f71-9cd57436f267","Type":"ContainerStarted","Data":"7a64357d4be4b755047c8ef79c6aa760d8b64e1c26a7cdb389cae40d9c055f4d"} Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.863250 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" event={"ID":"dd589714-d0d7-4e2c-8f71-9cd57436f267","Type":"ContainerStarted","Data":"a255becc662d106ae75fd7fcd4861a0b5ef0542649a9e7815966d3ace73dc03f"} Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.863272 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" event={"ID":"dd589714-d0d7-4e2c-8f71-9cd57436f267","Type":"ContainerStarted","Data":"dabb70046f37dfa8f717850bdcbeeff5a626f2a73ef8de0f91f3e7009844797b"} Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.863296 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" event={"ID":"dd589714-d0d7-4e2c-8f71-9cd57436f267","Type":"ContainerStarted","Data":"c4694a53afb44e943c6c8c789d1afd4ddc05994145aa0464dbd950c4312e9a63"} Feb 21 00:16:36 crc kubenswrapper[4906]: I0221 00:16:36.863326 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" event={"ID":"dd589714-d0d7-4e2c-8f71-9cd57436f267","Type":"ContainerStarted","Data":"676abf286d2fbba72960a8a3b41bf9b08bdc59b23c54e62b8afb7c64ada5bd61"} Feb 21 00:16:37 crc kubenswrapper[4906]: I0221 00:16:37.529489 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23efa997-378b-44cd-9f05-4a80559cd09b" 
path="/var/lib/kubelet/pods/23efa997-378b-44cd-9f05-4a80559cd09b/volumes" Feb 21 00:16:39 crc kubenswrapper[4906]: I0221 00:16:39.889075 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" event={"ID":"dd589714-d0d7-4e2c-8f71-9cd57436f267","Type":"ContainerStarted","Data":"c8df67e7b4ef11ab7cbf55388df5cd4985b989ef49082cc7ee25283bf0d43e07"} Feb 21 00:16:41 crc kubenswrapper[4906]: I0221 00:16:41.905411 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" event={"ID":"dd589714-d0d7-4e2c-8f71-9cd57436f267","Type":"ContainerStarted","Data":"8866e2acb74ba8ebc20bed66841eb2e315d9628b3a4846f729d30fd664b3e709"} Feb 21 00:16:41 crc kubenswrapper[4906]: I0221 00:16:41.905859 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:41 crc kubenswrapper[4906]: I0221 00:16:41.905896 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:41 crc kubenswrapper[4906]: I0221 00:16:41.905909 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:41 crc kubenswrapper[4906]: I0221 00:16:41.934353 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:41 crc kubenswrapper[4906]: I0221 00:16:41.942796 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:16:41 crc kubenswrapper[4906]: I0221 00:16:41.943549 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" podStartSLOduration=7.9435289220000005 podStartE2EDuration="7.943528922s" podCreationTimestamp="2026-02-21 00:16:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:16:41.937641189 +0000 UTC m=+537.189228715" watchObservedRunningTime="2026-02-21 00:16:41.943528922 +0000 UTC m=+537.195116438" Feb 21 00:16:43 crc kubenswrapper[4906]: I0221 00:16:43.124578 4906 patch_prober.go:28] interesting pod/machine-config-daemon-b9qdv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 00:16:43 crc kubenswrapper[4906]: I0221 00:16:43.124950 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 00:16:45 crc kubenswrapper[4906]: I0221 00:16:45.520454 4906 scope.go:117] "RemoveContainer" containerID="02d81b2a66d5bb33ec6a29de8fb43e80fcad7b637e052f4f4f0dd20e1c091ab4" Feb 21 00:16:45 crc kubenswrapper[4906]: E0221 00:16:45.521081 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-cqkxl_openshift-multus(d15db4e7-a13a-4bd9-8083-1ed09be64a82)\"" pod="openshift-multus/multus-cqkxl" podUID="d15db4e7-a13a-4bd9-8083-1ed09be64a82" Feb 21 00:16:46 crc kubenswrapper[4906]: I0221 00:16:46.436131 4906 scope.go:117] "RemoveContainer" containerID="364f42a3ab65178a37891c318024ee97e337f0e7ce56e0b773056b9fca4dfe8a" Feb 21 00:16:46 crc kubenswrapper[4906]: I0221 00:16:46.454326 4906 scope.go:117] "RemoveContainer" containerID="a967b27cf6b8d5317d66b7c8eecddc59d62789af43b80a2207af1c74e83cb6b3" Feb 21 00:16:56 crc kubenswrapper[4906]: I0221 00:16:56.517864 4906 scope.go:117] 
"RemoveContainer" containerID="02d81b2a66d5bb33ec6a29de8fb43e80fcad7b637e052f4f4f0dd20e1c091ab4" Feb 21 00:16:57 crc kubenswrapper[4906]: I0221 00:16:57.010954 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cqkxl_d15db4e7-a13a-4bd9-8083-1ed09be64a82/kube-multus/2.log" Feb 21 00:16:57 crc kubenswrapper[4906]: I0221 00:16:57.011275 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cqkxl" event={"ID":"d15db4e7-a13a-4bd9-8083-1ed09be64a82","Type":"ContainerStarted","Data":"43a4086ed1dc98f41df53ac0f9c0c8f6394c2e85546c24fe377640b4d311c2b1"} Feb 21 00:17:05 crc kubenswrapper[4906]: I0221 00:17:05.259397 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cgbck" Feb 21 00:17:13 crc kubenswrapper[4906]: I0221 00:17:13.124137 4906 patch_prober.go:28] interesting pod/machine-config-daemon-b9qdv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 00:17:13 crc kubenswrapper[4906]: I0221 00:17:13.126087 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 00:17:37 crc kubenswrapper[4906]: I0221 00:17:37.634581 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h7vwr"] Feb 21 00:17:37 crc kubenswrapper[4906]: I0221 00:17:37.636397 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-h7vwr" podUID="157d9b34-7ca3-40a2-8aae-4eeab9a53ef3" containerName="registry-server" 
containerID="cri-o://6a1a7125e367a5c61dc550ecaf4db901c0b704d721b183221c1fdaf1e47e1a37" gracePeriod=30 Feb 21 00:17:38 crc kubenswrapper[4906]: I0221 00:17:38.081569 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h7vwr" Feb 21 00:17:38 crc kubenswrapper[4906]: I0221 00:17:38.134494 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/157d9b34-7ca3-40a2-8aae-4eeab9a53ef3-catalog-content\") pod \"157d9b34-7ca3-40a2-8aae-4eeab9a53ef3\" (UID: \"157d9b34-7ca3-40a2-8aae-4eeab9a53ef3\") " Feb 21 00:17:38 crc kubenswrapper[4906]: I0221 00:17:38.134567 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/157d9b34-7ca3-40a2-8aae-4eeab9a53ef3-utilities\") pod \"157d9b34-7ca3-40a2-8aae-4eeab9a53ef3\" (UID: \"157d9b34-7ca3-40a2-8aae-4eeab9a53ef3\") " Feb 21 00:17:38 crc kubenswrapper[4906]: I0221 00:17:38.134594 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cj6cm\" (UniqueName: \"kubernetes.io/projected/157d9b34-7ca3-40a2-8aae-4eeab9a53ef3-kube-api-access-cj6cm\") pod \"157d9b34-7ca3-40a2-8aae-4eeab9a53ef3\" (UID: \"157d9b34-7ca3-40a2-8aae-4eeab9a53ef3\") " Feb 21 00:17:38 crc kubenswrapper[4906]: I0221 00:17:38.135486 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/157d9b34-7ca3-40a2-8aae-4eeab9a53ef3-utilities" (OuterVolumeSpecName: "utilities") pod "157d9b34-7ca3-40a2-8aae-4eeab9a53ef3" (UID: "157d9b34-7ca3-40a2-8aae-4eeab9a53ef3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:17:38 crc kubenswrapper[4906]: I0221 00:17:38.140306 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/157d9b34-7ca3-40a2-8aae-4eeab9a53ef3-kube-api-access-cj6cm" (OuterVolumeSpecName: "kube-api-access-cj6cm") pod "157d9b34-7ca3-40a2-8aae-4eeab9a53ef3" (UID: "157d9b34-7ca3-40a2-8aae-4eeab9a53ef3"). InnerVolumeSpecName "kube-api-access-cj6cm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:17:38 crc kubenswrapper[4906]: I0221 00:17:38.160220 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/157d9b34-7ca3-40a2-8aae-4eeab9a53ef3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "157d9b34-7ca3-40a2-8aae-4eeab9a53ef3" (UID: "157d9b34-7ca3-40a2-8aae-4eeab9a53ef3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:17:38 crc kubenswrapper[4906]: I0221 00:17:38.235950 4906 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/157d9b34-7ca3-40a2-8aae-4eeab9a53ef3-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 21 00:17:38 crc kubenswrapper[4906]: I0221 00:17:38.235982 4906 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/157d9b34-7ca3-40a2-8aae-4eeab9a53ef3-utilities\") on node \"crc\" DevicePath \"\"" Feb 21 00:17:38 crc kubenswrapper[4906]: I0221 00:17:38.235992 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cj6cm\" (UniqueName: \"kubernetes.io/projected/157d9b34-7ca3-40a2-8aae-4eeab9a53ef3-kube-api-access-cj6cm\") on node \"crc\" DevicePath \"\"" Feb 21 00:17:38 crc kubenswrapper[4906]: I0221 00:17:38.298960 4906 generic.go:334] "Generic (PLEG): container finished" podID="157d9b34-7ca3-40a2-8aae-4eeab9a53ef3" 
containerID="6a1a7125e367a5c61dc550ecaf4db901c0b704d721b183221c1fdaf1e47e1a37" exitCode=0 Feb 21 00:17:38 crc kubenswrapper[4906]: I0221 00:17:38.299020 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h7vwr" event={"ID":"157d9b34-7ca3-40a2-8aae-4eeab9a53ef3","Type":"ContainerDied","Data":"6a1a7125e367a5c61dc550ecaf4db901c0b704d721b183221c1fdaf1e47e1a37"} Feb 21 00:17:38 crc kubenswrapper[4906]: I0221 00:17:38.299061 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h7vwr" event={"ID":"157d9b34-7ca3-40a2-8aae-4eeab9a53ef3","Type":"ContainerDied","Data":"896fd9cbdefda09e1f49abc5aa94712d011cecba655317b2c522078fce744796"} Feb 21 00:17:38 crc kubenswrapper[4906]: I0221 00:17:38.299090 4906 scope.go:117] "RemoveContainer" containerID="6a1a7125e367a5c61dc550ecaf4db901c0b704d721b183221c1fdaf1e47e1a37" Feb 21 00:17:38 crc kubenswrapper[4906]: I0221 00:17:38.299248 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h7vwr" Feb 21 00:17:38 crc kubenswrapper[4906]: I0221 00:17:38.322462 4906 scope.go:117] "RemoveContainer" containerID="5540348061dce98b132bf6558a8f39908a3be08ce006e2c306936933762d4de1" Feb 21 00:17:38 crc kubenswrapper[4906]: I0221 00:17:38.348063 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h7vwr"] Feb 21 00:17:38 crc kubenswrapper[4906]: I0221 00:17:38.355281 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-h7vwr"] Feb 21 00:17:38 crc kubenswrapper[4906]: I0221 00:17:38.355309 4906 scope.go:117] "RemoveContainer" containerID="3b712648705473242521ed78cc2406fac67ca880a168ea9d482edb728bbecf84" Feb 21 00:17:38 crc kubenswrapper[4906]: I0221 00:17:38.374853 4906 scope.go:117] "RemoveContainer" containerID="6a1a7125e367a5c61dc550ecaf4db901c0b704d721b183221c1fdaf1e47e1a37" Feb 21 00:17:38 crc kubenswrapper[4906]: E0221 00:17:38.375286 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a1a7125e367a5c61dc550ecaf4db901c0b704d721b183221c1fdaf1e47e1a37\": container with ID starting with 6a1a7125e367a5c61dc550ecaf4db901c0b704d721b183221c1fdaf1e47e1a37 not found: ID does not exist" containerID="6a1a7125e367a5c61dc550ecaf4db901c0b704d721b183221c1fdaf1e47e1a37" Feb 21 00:17:38 crc kubenswrapper[4906]: I0221 00:17:38.375322 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a1a7125e367a5c61dc550ecaf4db901c0b704d721b183221c1fdaf1e47e1a37"} err="failed to get container status \"6a1a7125e367a5c61dc550ecaf4db901c0b704d721b183221c1fdaf1e47e1a37\": rpc error: code = NotFound desc = could not find container \"6a1a7125e367a5c61dc550ecaf4db901c0b704d721b183221c1fdaf1e47e1a37\": container with ID starting with 6a1a7125e367a5c61dc550ecaf4db901c0b704d721b183221c1fdaf1e47e1a37 not found: 
ID does not exist" Feb 21 00:17:38 crc kubenswrapper[4906]: I0221 00:17:38.375348 4906 scope.go:117] "RemoveContainer" containerID="5540348061dce98b132bf6558a8f39908a3be08ce006e2c306936933762d4de1" Feb 21 00:17:38 crc kubenswrapper[4906]: E0221 00:17:38.375679 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5540348061dce98b132bf6558a8f39908a3be08ce006e2c306936933762d4de1\": container with ID starting with 5540348061dce98b132bf6558a8f39908a3be08ce006e2c306936933762d4de1 not found: ID does not exist" containerID="5540348061dce98b132bf6558a8f39908a3be08ce006e2c306936933762d4de1" Feb 21 00:17:38 crc kubenswrapper[4906]: I0221 00:17:38.375790 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5540348061dce98b132bf6558a8f39908a3be08ce006e2c306936933762d4de1"} err="failed to get container status \"5540348061dce98b132bf6558a8f39908a3be08ce006e2c306936933762d4de1\": rpc error: code = NotFound desc = could not find container \"5540348061dce98b132bf6558a8f39908a3be08ce006e2c306936933762d4de1\": container with ID starting with 5540348061dce98b132bf6558a8f39908a3be08ce006e2c306936933762d4de1 not found: ID does not exist" Feb 21 00:17:38 crc kubenswrapper[4906]: I0221 00:17:38.375832 4906 scope.go:117] "RemoveContainer" containerID="3b712648705473242521ed78cc2406fac67ca880a168ea9d482edb728bbecf84" Feb 21 00:17:38 crc kubenswrapper[4906]: E0221 00:17:38.376246 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b712648705473242521ed78cc2406fac67ca880a168ea9d482edb728bbecf84\": container with ID starting with 3b712648705473242521ed78cc2406fac67ca880a168ea9d482edb728bbecf84 not found: ID does not exist" containerID="3b712648705473242521ed78cc2406fac67ca880a168ea9d482edb728bbecf84" Feb 21 00:17:38 crc kubenswrapper[4906]: I0221 00:17:38.376284 4906 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b712648705473242521ed78cc2406fac67ca880a168ea9d482edb728bbecf84"} err="failed to get container status \"3b712648705473242521ed78cc2406fac67ca880a168ea9d482edb728bbecf84\": rpc error: code = NotFound desc = could not find container \"3b712648705473242521ed78cc2406fac67ca880a168ea9d482edb728bbecf84\": container with ID starting with 3b712648705473242521ed78cc2406fac67ca880a168ea9d482edb728bbecf84 not found: ID does not exist" Feb 21 00:17:39 crc kubenswrapper[4906]: I0221 00:17:39.524250 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="157d9b34-7ca3-40a2-8aae-4eeab9a53ef3" path="/var/lib/kubelet/pods/157d9b34-7ca3-40a2-8aae-4eeab9a53ef3/volumes" Feb 21 00:17:41 crc kubenswrapper[4906]: I0221 00:17:41.337924 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bzfxb"] Feb 21 00:17:41 crc kubenswrapper[4906]: E0221 00:17:41.338325 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="157d9b34-7ca3-40a2-8aae-4eeab9a53ef3" containerName="registry-server" Feb 21 00:17:41 crc kubenswrapper[4906]: I0221 00:17:41.338357 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="157d9b34-7ca3-40a2-8aae-4eeab9a53ef3" containerName="registry-server" Feb 21 00:17:41 crc kubenswrapper[4906]: E0221 00:17:41.338377 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="157d9b34-7ca3-40a2-8aae-4eeab9a53ef3" containerName="extract-utilities" Feb 21 00:17:41 crc kubenswrapper[4906]: I0221 00:17:41.338394 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="157d9b34-7ca3-40a2-8aae-4eeab9a53ef3" containerName="extract-utilities" Feb 21 00:17:41 crc kubenswrapper[4906]: E0221 00:17:41.338448 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="157d9b34-7ca3-40a2-8aae-4eeab9a53ef3" containerName="extract-content" Feb 21 00:17:41 crc kubenswrapper[4906]: 
I0221 00:17:41.338467 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="157d9b34-7ca3-40a2-8aae-4eeab9a53ef3" containerName="extract-content" Feb 21 00:17:41 crc kubenswrapper[4906]: I0221 00:17:41.338740 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="157d9b34-7ca3-40a2-8aae-4eeab9a53ef3" containerName="registry-server" Feb 21 00:17:41 crc kubenswrapper[4906]: I0221 00:17:41.341377 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bzfxb" Feb 21 00:17:41 crc kubenswrapper[4906]: I0221 00:17:41.347190 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 21 00:17:41 crc kubenswrapper[4906]: I0221 00:17:41.357978 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bzfxb"] Feb 21 00:17:41 crc kubenswrapper[4906]: I0221 00:17:41.375880 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d31a01da-9278-4a4c-a880-60bbacec4d63-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bzfxb\" (UID: \"d31a01da-9278-4a4c-a880-60bbacec4d63\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bzfxb" Feb 21 00:17:41 crc kubenswrapper[4906]: I0221 00:17:41.376023 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2km2j\" (UniqueName: \"kubernetes.io/projected/d31a01da-9278-4a4c-a880-60bbacec4d63-kube-api-access-2km2j\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bzfxb\" (UID: \"d31a01da-9278-4a4c-a880-60bbacec4d63\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bzfxb" Feb 21 00:17:41 crc kubenswrapper[4906]: 
I0221 00:17:41.376065 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d31a01da-9278-4a4c-a880-60bbacec4d63-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bzfxb\" (UID: \"d31a01da-9278-4a4c-a880-60bbacec4d63\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bzfxb" Feb 21 00:17:41 crc kubenswrapper[4906]: I0221 00:17:41.477101 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d31a01da-9278-4a4c-a880-60bbacec4d63-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bzfxb\" (UID: \"d31a01da-9278-4a4c-a880-60bbacec4d63\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bzfxb" Feb 21 00:17:41 crc kubenswrapper[4906]: I0221 00:17:41.477194 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2km2j\" (UniqueName: \"kubernetes.io/projected/d31a01da-9278-4a4c-a880-60bbacec4d63-kube-api-access-2km2j\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bzfxb\" (UID: \"d31a01da-9278-4a4c-a880-60bbacec4d63\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bzfxb" Feb 21 00:17:41 crc kubenswrapper[4906]: I0221 00:17:41.477220 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d31a01da-9278-4a4c-a880-60bbacec4d63-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bzfxb\" (UID: \"d31a01da-9278-4a4c-a880-60bbacec4d63\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bzfxb" Feb 21 00:17:41 crc kubenswrapper[4906]: I0221 00:17:41.478128 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/d31a01da-9278-4a4c-a880-60bbacec4d63-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bzfxb\" (UID: \"d31a01da-9278-4a4c-a880-60bbacec4d63\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bzfxb" Feb 21 00:17:41 crc kubenswrapper[4906]: I0221 00:17:41.478188 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d31a01da-9278-4a4c-a880-60bbacec4d63-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bzfxb\" (UID: \"d31a01da-9278-4a4c-a880-60bbacec4d63\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bzfxb" Feb 21 00:17:41 crc kubenswrapper[4906]: I0221 00:17:41.498905 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2km2j\" (UniqueName: \"kubernetes.io/projected/d31a01da-9278-4a4c-a880-60bbacec4d63-kube-api-access-2km2j\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bzfxb\" (UID: \"d31a01da-9278-4a4c-a880-60bbacec4d63\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bzfxb" Feb 21 00:17:41 crc kubenswrapper[4906]: I0221 00:17:41.670410 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bzfxb" Feb 21 00:17:41 crc kubenswrapper[4906]: I0221 00:17:41.939381 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bzfxb"] Feb 21 00:17:42 crc kubenswrapper[4906]: I0221 00:17:42.322076 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bzfxb" event={"ID":"d31a01da-9278-4a4c-a880-60bbacec4d63","Type":"ContainerStarted","Data":"32732aa4fc6b4d290ced9a1342cdc54f6f3c6766a644ff2ee9e7aa35a4ccd814"} Feb 21 00:17:42 crc kubenswrapper[4906]: I0221 00:17:42.322435 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bzfxb" event={"ID":"d31a01da-9278-4a4c-a880-60bbacec4d63","Type":"ContainerStarted","Data":"58e8dd3f022b071117d465650f782aaa8dbce9053a5543d5065c786e45ed5c70"} Feb 21 00:17:43 crc kubenswrapper[4906]: I0221 00:17:43.124245 4906 patch_prober.go:28] interesting pod/machine-config-daemon-b9qdv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 00:17:43 crc kubenswrapper[4906]: I0221 00:17:43.124312 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 00:17:43 crc kubenswrapper[4906]: I0221 00:17:43.124358 4906 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" Feb 21 00:17:43 crc kubenswrapper[4906]: I0221 00:17:43.124909 4906 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6169b545a7a99c9b79ad6bca069941dcc1b133c237cbad0be59cce8e9e5cf28f"} pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 21 00:17:43 crc kubenswrapper[4906]: I0221 00:17:43.124965 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3" containerName="machine-config-daemon" containerID="cri-o://6169b545a7a99c9b79ad6bca069941dcc1b133c237cbad0be59cce8e9e5cf28f" gracePeriod=600 Feb 21 00:17:43 crc kubenswrapper[4906]: I0221 00:17:43.332142 4906 generic.go:334] "Generic (PLEG): container finished" podID="17518505-fa81-4399-b6cd-5527dae35ef3" containerID="6169b545a7a99c9b79ad6bca069941dcc1b133c237cbad0be59cce8e9e5cf28f" exitCode=0 Feb 21 00:17:43 crc kubenswrapper[4906]: I0221 00:17:43.332174 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" event={"ID":"17518505-fa81-4399-b6cd-5527dae35ef3","Type":"ContainerDied","Data":"6169b545a7a99c9b79ad6bca069941dcc1b133c237cbad0be59cce8e9e5cf28f"} Feb 21 00:17:43 crc kubenswrapper[4906]: I0221 00:17:43.332222 4906 scope.go:117] "RemoveContainer" containerID="202df608ff6999e26cdc9a4d938cc29189ef1bc902ce31f6c1f085f6055345ad" Feb 21 00:17:43 crc kubenswrapper[4906]: I0221 00:17:43.336258 4906 generic.go:334] "Generic (PLEG): container finished" podID="d31a01da-9278-4a4c-a880-60bbacec4d63" containerID="32732aa4fc6b4d290ced9a1342cdc54f6f3c6766a644ff2ee9e7aa35a4ccd814" exitCode=0 Feb 21 00:17:43 crc kubenswrapper[4906]: I0221 00:17:43.336317 4906 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bzfxb" event={"ID":"d31a01da-9278-4a4c-a880-60bbacec4d63","Type":"ContainerDied","Data":"32732aa4fc6b4d290ced9a1342cdc54f6f3c6766a644ff2ee9e7aa35a4ccd814"} Feb 21 00:17:43 crc kubenswrapper[4906]: I0221 00:17:43.338618 4906 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 21 00:17:44 crc kubenswrapper[4906]: I0221 00:17:44.352419 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" event={"ID":"17518505-fa81-4399-b6cd-5527dae35ef3","Type":"ContainerStarted","Data":"5ba16906db7f4c001a4d75d4a61fcd36e0dcef9b8dae4fc267bc90aeebbab0e4"} Feb 21 00:17:45 crc kubenswrapper[4906]: I0221 00:17:45.360479 4906 generic.go:334] "Generic (PLEG): container finished" podID="d31a01da-9278-4a4c-a880-60bbacec4d63" containerID="c4343cea34dfd17ea5d66a9aa631c8bc0708bcc6025f3815e5435880c250a1a7" exitCode=0 Feb 21 00:17:45 crc kubenswrapper[4906]: I0221 00:17:45.360541 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bzfxb" event={"ID":"d31a01da-9278-4a4c-a880-60bbacec4d63","Type":"ContainerDied","Data":"c4343cea34dfd17ea5d66a9aa631c8bc0708bcc6025f3815e5435880c250a1a7"} Feb 21 00:17:46 crc kubenswrapper[4906]: I0221 00:17:46.370666 4906 generic.go:334] "Generic (PLEG): container finished" podID="d31a01da-9278-4a4c-a880-60bbacec4d63" containerID="5b1d69f87c47786326fc5d510e7c7ace61555f01a1343b1185e639d5ffcb7f72" exitCode=0 Feb 21 00:17:46 crc kubenswrapper[4906]: I0221 00:17:46.370850 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bzfxb" 
event={"ID":"d31a01da-9278-4a4c-a880-60bbacec4d63","Type":"ContainerDied","Data":"5b1d69f87c47786326fc5d510e7c7ace61555f01a1343b1185e639d5ffcb7f72"} Feb 21 00:17:46 crc kubenswrapper[4906]: I0221 00:17:46.499744 4906 scope.go:117] "RemoveContainer" containerID="7bddb1006ea71471e455bc38a151b1a8cc0a40bb8eacae4855972f4de91525b7" Feb 21 00:17:46 crc kubenswrapper[4906]: I0221 00:17:46.519843 4906 scope.go:117] "RemoveContainer" containerID="03d79fa6e580e48375be3589dab10bc8d56ad852fadd9aabf37d3c7b4d517677" Feb 21 00:17:46 crc kubenswrapper[4906]: I0221 00:17:46.541357 4906 scope.go:117] "RemoveContainer" containerID="2c9d5d6dafa68f34552253beedd93b0f829fe32732da3746a736fa19f2314f8f" Feb 21 00:17:46 crc kubenswrapper[4906]: I0221 00:17:46.553635 4906 scope.go:117] "RemoveContainer" containerID="788aaebebcfe9de8be190f1557a46dffe191b83c9f4f2f14e363d181c8f09bbd" Feb 21 00:17:47 crc kubenswrapper[4906]: I0221 00:17:47.601797 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bzfxb" Feb 21 00:17:47 crc kubenswrapper[4906]: I0221 00:17:47.658815 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2km2j\" (UniqueName: \"kubernetes.io/projected/d31a01da-9278-4a4c-a880-60bbacec4d63-kube-api-access-2km2j\") pod \"d31a01da-9278-4a4c-a880-60bbacec4d63\" (UID: \"d31a01da-9278-4a4c-a880-60bbacec4d63\") " Feb 21 00:17:47 crc kubenswrapper[4906]: I0221 00:17:47.658901 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d31a01da-9278-4a4c-a880-60bbacec4d63-bundle\") pod \"d31a01da-9278-4a4c-a880-60bbacec4d63\" (UID: \"d31a01da-9278-4a4c-a880-60bbacec4d63\") " Feb 21 00:17:47 crc kubenswrapper[4906]: I0221 00:17:47.658924 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/d31a01da-9278-4a4c-a880-60bbacec4d63-util\") pod \"d31a01da-9278-4a4c-a880-60bbacec4d63\" (UID: \"d31a01da-9278-4a4c-a880-60bbacec4d63\") " Feb 21 00:17:47 crc kubenswrapper[4906]: I0221 00:17:47.661034 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d31a01da-9278-4a4c-a880-60bbacec4d63-bundle" (OuterVolumeSpecName: "bundle") pod "d31a01da-9278-4a4c-a880-60bbacec4d63" (UID: "d31a01da-9278-4a4c-a880-60bbacec4d63"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:17:47 crc kubenswrapper[4906]: I0221 00:17:47.666860 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d31a01da-9278-4a4c-a880-60bbacec4d63-kube-api-access-2km2j" (OuterVolumeSpecName: "kube-api-access-2km2j") pod "d31a01da-9278-4a4c-a880-60bbacec4d63" (UID: "d31a01da-9278-4a4c-a880-60bbacec4d63"). InnerVolumeSpecName "kube-api-access-2km2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:17:47 crc kubenswrapper[4906]: I0221 00:17:47.760415 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2km2j\" (UniqueName: \"kubernetes.io/projected/d31a01da-9278-4a4c-a880-60bbacec4d63-kube-api-access-2km2j\") on node \"crc\" DevicePath \"\"" Feb 21 00:17:47 crc kubenswrapper[4906]: I0221 00:17:47.760451 4906 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d31a01da-9278-4a4c-a880-60bbacec4d63-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 00:17:47 crc kubenswrapper[4906]: I0221 00:17:47.871755 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d31a01da-9278-4a4c-a880-60bbacec4d63-util" (OuterVolumeSpecName: "util") pod "d31a01da-9278-4a4c-a880-60bbacec4d63" (UID: "d31a01da-9278-4a4c-a880-60bbacec4d63"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:17:47 crc kubenswrapper[4906]: I0221 00:17:47.962816 4906 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d31a01da-9278-4a4c-a880-60bbacec4d63-util\") on node \"crc\" DevicePath \"\"" Feb 21 00:17:48 crc kubenswrapper[4906]: I0221 00:17:48.319170 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1xsfrg"] Feb 21 00:17:48 crc kubenswrapper[4906]: E0221 00:17:48.319406 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d31a01da-9278-4a4c-a880-60bbacec4d63" containerName="util" Feb 21 00:17:48 crc kubenswrapper[4906]: I0221 00:17:48.319427 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="d31a01da-9278-4a4c-a880-60bbacec4d63" containerName="util" Feb 21 00:17:48 crc kubenswrapper[4906]: E0221 00:17:48.319442 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d31a01da-9278-4a4c-a880-60bbacec4d63" containerName="extract" Feb 21 00:17:48 crc kubenswrapper[4906]: I0221 00:17:48.319451 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="d31a01da-9278-4a4c-a880-60bbacec4d63" containerName="extract" Feb 21 00:17:48 crc kubenswrapper[4906]: E0221 00:17:48.319472 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d31a01da-9278-4a4c-a880-60bbacec4d63" containerName="pull" Feb 21 00:17:48 crc kubenswrapper[4906]: I0221 00:17:48.319481 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="d31a01da-9278-4a4c-a880-60bbacec4d63" containerName="pull" Feb 21 00:17:48 crc kubenswrapper[4906]: I0221 00:17:48.319600 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="d31a01da-9278-4a4c-a880-60bbacec4d63" containerName="extract" Feb 21 00:17:48 crc kubenswrapper[4906]: I0221 00:17:48.320506 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1xsfrg" Feb 21 00:17:48 crc kubenswrapper[4906]: I0221 00:17:48.334149 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1xsfrg"] Feb 21 00:17:48 crc kubenswrapper[4906]: I0221 00:17:48.368613 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/94d58122-e78f-4369-937c-ba79bf5eec59-bundle\") pod \"00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1xsfrg\" (UID: \"94d58122-e78f-4369-937c-ba79bf5eec59\") " pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1xsfrg" Feb 21 00:17:48 crc kubenswrapper[4906]: I0221 00:17:48.368776 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv7pj\" (UniqueName: \"kubernetes.io/projected/94d58122-e78f-4369-937c-ba79bf5eec59-kube-api-access-xv7pj\") pod \"00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1xsfrg\" (UID: \"94d58122-e78f-4369-937c-ba79bf5eec59\") " pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1xsfrg" Feb 21 00:17:48 crc kubenswrapper[4906]: I0221 00:17:48.368853 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/94d58122-e78f-4369-937c-ba79bf5eec59-util\") pod \"00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1xsfrg\" (UID: \"94d58122-e78f-4369-937c-ba79bf5eec59\") " pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1xsfrg" Feb 21 00:17:48 crc kubenswrapper[4906]: I0221 00:17:48.386543 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bzfxb" 
event={"ID":"d31a01da-9278-4a4c-a880-60bbacec4d63","Type":"ContainerDied","Data":"58e8dd3f022b071117d465650f782aaa8dbce9053a5543d5065c786e45ed5c70"} Feb 21 00:17:48 crc kubenswrapper[4906]: I0221 00:17:48.386586 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58e8dd3f022b071117d465650f782aaa8dbce9053a5543d5065c786e45ed5c70" Feb 21 00:17:48 crc kubenswrapper[4906]: I0221 00:17:48.386658 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bzfxb" Feb 21 00:17:48 crc kubenswrapper[4906]: I0221 00:17:48.470403 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xv7pj\" (UniqueName: \"kubernetes.io/projected/94d58122-e78f-4369-937c-ba79bf5eec59-kube-api-access-xv7pj\") pod \"00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1xsfrg\" (UID: \"94d58122-e78f-4369-937c-ba79bf5eec59\") " pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1xsfrg" Feb 21 00:17:48 crc kubenswrapper[4906]: I0221 00:17:48.470468 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/94d58122-e78f-4369-937c-ba79bf5eec59-util\") pod \"00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1xsfrg\" (UID: \"94d58122-e78f-4369-937c-ba79bf5eec59\") " pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1xsfrg" Feb 21 00:17:48 crc kubenswrapper[4906]: I0221 00:17:48.470543 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/94d58122-e78f-4369-937c-ba79bf5eec59-bundle\") pod \"00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1xsfrg\" (UID: \"94d58122-e78f-4369-937c-ba79bf5eec59\") " pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1xsfrg" Feb 
21 00:17:48 crc kubenswrapper[4906]: I0221 00:17:48.471299 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/94d58122-e78f-4369-937c-ba79bf5eec59-util\") pod \"00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1xsfrg\" (UID: \"94d58122-e78f-4369-937c-ba79bf5eec59\") " pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1xsfrg" Feb 21 00:17:48 crc kubenswrapper[4906]: I0221 00:17:48.471317 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/94d58122-e78f-4369-937c-ba79bf5eec59-bundle\") pod \"00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1xsfrg\" (UID: \"94d58122-e78f-4369-937c-ba79bf5eec59\") " pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1xsfrg" Feb 21 00:17:48 crc kubenswrapper[4906]: I0221 00:17:48.490880 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv7pj\" (UniqueName: \"kubernetes.io/projected/94d58122-e78f-4369-937c-ba79bf5eec59-kube-api-access-xv7pj\") pod \"00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1xsfrg\" (UID: \"94d58122-e78f-4369-937c-ba79bf5eec59\") " pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1xsfrg" Feb 21 00:17:48 crc kubenswrapper[4906]: I0221 00:17:48.650279 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1xsfrg" Feb 21 00:17:48 crc kubenswrapper[4906]: I0221 00:17:48.856595 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1xsfrg"] Feb 21 00:17:49 crc kubenswrapper[4906]: I0221 00:17:49.395782 4906 generic.go:334] "Generic (PLEG): container finished" podID="94d58122-e78f-4369-937c-ba79bf5eec59" containerID="1353ddb1841b9f4d037a45b639cc252892db69adb1f79975eb36442a99ad854f" exitCode=0 Feb 21 00:17:49 crc kubenswrapper[4906]: I0221 00:17:49.395841 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1xsfrg" event={"ID":"94d58122-e78f-4369-937c-ba79bf5eec59","Type":"ContainerDied","Data":"1353ddb1841b9f4d037a45b639cc252892db69adb1f79975eb36442a99ad854f"} Feb 21 00:17:49 crc kubenswrapper[4906]: I0221 00:17:49.395893 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1xsfrg" event={"ID":"94d58122-e78f-4369-937c-ba79bf5eec59","Type":"ContainerStarted","Data":"fa199d98467cf154ffa2bcc9aabf81235aa3a02143ab5d4d2879a7d515235846"} Feb 21 00:17:50 crc kubenswrapper[4906]: I0221 00:17:50.413884 4906 generic.go:334] "Generic (PLEG): container finished" podID="94d58122-e78f-4369-937c-ba79bf5eec59" containerID="2ec4e64f975cd4874280ed823d439dc5838a58eb92da95437507f14b2e0171b0" exitCode=0 Feb 21 00:17:50 crc kubenswrapper[4906]: I0221 00:17:50.413999 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1xsfrg" event={"ID":"94d58122-e78f-4369-937c-ba79bf5eec59","Type":"ContainerDied","Data":"2ec4e64f975cd4874280ed823d439dc5838a58eb92da95437507f14b2e0171b0"} Feb 21 00:17:51 crc kubenswrapper[4906]: I0221 00:17:51.421354 4906 
generic.go:334] "Generic (PLEG): container finished" podID="94d58122-e78f-4369-937c-ba79bf5eec59" containerID="2960f466909598f8c3a987d8d30178fcf3886acbfd3a59d4805714ebdb30ea40" exitCode=0 Feb 21 00:17:51 crc kubenswrapper[4906]: I0221 00:17:51.421397 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1xsfrg" event={"ID":"94d58122-e78f-4369-937c-ba79bf5eec59","Type":"ContainerDied","Data":"2960f466909598f8c3a987d8d30178fcf3886acbfd3a59d4805714ebdb30ea40"} Feb 21 00:17:52 crc kubenswrapper[4906]: I0221 00:17:52.525911 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56bzg5"] Feb 21 00:17:52 crc kubenswrapper[4906]: I0221 00:17:52.532000 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56bzg5" Feb 21 00:17:52 crc kubenswrapper[4906]: I0221 00:17:52.542100 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56bzg5"] Feb 21 00:17:52 crc kubenswrapper[4906]: I0221 00:17:52.626650 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a91476fa-3559-46e8-a2de-14d0d36d2cad-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56bzg5\" (UID: \"a91476fa-3559-46e8-a2de-14d0d36d2cad\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56bzg5" Feb 21 00:17:52 crc kubenswrapper[4906]: I0221 00:17:52.627047 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a91476fa-3559-46e8-a2de-14d0d36d2cad-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56bzg5\" 
(UID: \"a91476fa-3559-46e8-a2de-14d0d36d2cad\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56bzg5" Feb 21 00:17:52 crc kubenswrapper[4906]: I0221 00:17:52.627075 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mp4j2\" (UniqueName: \"kubernetes.io/projected/a91476fa-3559-46e8-a2de-14d0d36d2cad-kube-api-access-mp4j2\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56bzg5\" (UID: \"a91476fa-3559-46e8-a2de-14d0d36d2cad\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56bzg5" Feb 21 00:17:52 crc kubenswrapper[4906]: I0221 00:17:52.707700 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1xsfrg" Feb 21 00:17:52 crc kubenswrapper[4906]: I0221 00:17:52.727949 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a91476fa-3559-46e8-a2de-14d0d36d2cad-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56bzg5\" (UID: \"a91476fa-3559-46e8-a2de-14d0d36d2cad\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56bzg5" Feb 21 00:17:52 crc kubenswrapper[4906]: I0221 00:17:52.727993 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a91476fa-3559-46e8-a2de-14d0d36d2cad-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56bzg5\" (UID: \"a91476fa-3559-46e8-a2de-14d0d36d2cad\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56bzg5" Feb 21 00:17:52 crc kubenswrapper[4906]: I0221 00:17:52.728024 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mp4j2\" (UniqueName: 
\"kubernetes.io/projected/a91476fa-3559-46e8-a2de-14d0d36d2cad-kube-api-access-mp4j2\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56bzg5\" (UID: \"a91476fa-3559-46e8-a2de-14d0d36d2cad\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56bzg5" Feb 21 00:17:52 crc kubenswrapper[4906]: I0221 00:17:52.728460 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a91476fa-3559-46e8-a2de-14d0d36d2cad-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56bzg5\" (UID: \"a91476fa-3559-46e8-a2de-14d0d36d2cad\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56bzg5" Feb 21 00:17:52 crc kubenswrapper[4906]: I0221 00:17:52.728509 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a91476fa-3559-46e8-a2de-14d0d36d2cad-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56bzg5\" (UID: \"a91476fa-3559-46e8-a2de-14d0d36d2cad\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56bzg5" Feb 21 00:17:52 crc kubenswrapper[4906]: I0221 00:17:52.787861 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mp4j2\" (UniqueName: \"kubernetes.io/projected/a91476fa-3559-46e8-a2de-14d0d36d2cad-kube-api-access-mp4j2\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56bzg5\" (UID: \"a91476fa-3559-46e8-a2de-14d0d36d2cad\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56bzg5" Feb 21 00:17:52 crc kubenswrapper[4906]: I0221 00:17:52.829068 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/94d58122-e78f-4369-937c-ba79bf5eec59-util\") pod \"94d58122-e78f-4369-937c-ba79bf5eec59\" (UID: 
\"94d58122-e78f-4369-937c-ba79bf5eec59\") " Feb 21 00:17:52 crc kubenswrapper[4906]: I0221 00:17:52.829158 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xv7pj\" (UniqueName: \"kubernetes.io/projected/94d58122-e78f-4369-937c-ba79bf5eec59-kube-api-access-xv7pj\") pod \"94d58122-e78f-4369-937c-ba79bf5eec59\" (UID: \"94d58122-e78f-4369-937c-ba79bf5eec59\") " Feb 21 00:17:52 crc kubenswrapper[4906]: I0221 00:17:52.829182 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/94d58122-e78f-4369-937c-ba79bf5eec59-bundle\") pod \"94d58122-e78f-4369-937c-ba79bf5eec59\" (UID: \"94d58122-e78f-4369-937c-ba79bf5eec59\") " Feb 21 00:17:52 crc kubenswrapper[4906]: I0221 00:17:52.830275 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94d58122-e78f-4369-937c-ba79bf5eec59-bundle" (OuterVolumeSpecName: "bundle") pod "94d58122-e78f-4369-937c-ba79bf5eec59" (UID: "94d58122-e78f-4369-937c-ba79bf5eec59"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:17:52 crc kubenswrapper[4906]: I0221 00:17:52.833870 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94d58122-e78f-4369-937c-ba79bf5eec59-kube-api-access-xv7pj" (OuterVolumeSpecName: "kube-api-access-xv7pj") pod "94d58122-e78f-4369-937c-ba79bf5eec59" (UID: "94d58122-e78f-4369-937c-ba79bf5eec59"). InnerVolumeSpecName "kube-api-access-xv7pj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:17:52 crc kubenswrapper[4906]: I0221 00:17:52.842402 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94d58122-e78f-4369-937c-ba79bf5eec59-util" (OuterVolumeSpecName: "util") pod "94d58122-e78f-4369-937c-ba79bf5eec59" (UID: "94d58122-e78f-4369-937c-ba79bf5eec59"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:17:52 crc kubenswrapper[4906]: I0221 00:17:52.856851 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56bzg5" Feb 21 00:17:52 crc kubenswrapper[4906]: I0221 00:17:52.930351 4906 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/94d58122-e78f-4369-937c-ba79bf5eec59-util\") on node \"crc\" DevicePath \"\"" Feb 21 00:17:52 crc kubenswrapper[4906]: I0221 00:17:52.930390 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xv7pj\" (UniqueName: \"kubernetes.io/projected/94d58122-e78f-4369-937c-ba79bf5eec59-kube-api-access-xv7pj\") on node \"crc\" DevicePath \"\"" Feb 21 00:17:52 crc kubenswrapper[4906]: I0221 00:17:52.930402 4906 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/94d58122-e78f-4369-937c-ba79bf5eec59-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 00:17:53 crc kubenswrapper[4906]: I0221 00:17:53.352583 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56bzg5"] Feb 21 00:17:53 crc kubenswrapper[4906]: W0221 00:17:53.357309 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda91476fa_3559_46e8_a2de_14d0d36d2cad.slice/crio-65de2b82707656955323bfb833dc63fe4a473709ba6b3ebc1e2166891279eca2 WatchSource:0}: Error finding container 65de2b82707656955323bfb833dc63fe4a473709ba6b3ebc1e2166891279eca2: Status 404 returned error can't find the container with id 65de2b82707656955323bfb833dc63fe4a473709ba6b3ebc1e2166891279eca2 Feb 21 00:17:53 crc kubenswrapper[4906]: I0221 00:17:53.432256 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56bzg5" event={"ID":"a91476fa-3559-46e8-a2de-14d0d36d2cad","Type":"ContainerStarted","Data":"65de2b82707656955323bfb833dc63fe4a473709ba6b3ebc1e2166891279eca2"} Feb 21 00:17:53 crc kubenswrapper[4906]: I0221 00:17:53.434731 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1xsfrg" event={"ID":"94d58122-e78f-4369-937c-ba79bf5eec59","Type":"ContainerDied","Data":"fa199d98467cf154ffa2bcc9aabf81235aa3a02143ab5d4d2879a7d515235846"} Feb 21 00:17:53 crc kubenswrapper[4906]: I0221 00:17:53.434787 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa199d98467cf154ffa2bcc9aabf81235aa3a02143ab5d4d2879a7d515235846" Feb 21 00:17:53 crc kubenswrapper[4906]: I0221 00:17:53.434795 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1xsfrg" Feb 21 00:17:54 crc kubenswrapper[4906]: I0221 00:17:54.440406 4906 generic.go:334] "Generic (PLEG): container finished" podID="a91476fa-3559-46e8-a2de-14d0d36d2cad" containerID="1a06e954eda154383ac24758b4908f2734d59662f9cbd1302a72e8ba42672d21" exitCode=0 Feb 21 00:17:54 crc kubenswrapper[4906]: I0221 00:17:54.440457 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56bzg5" event={"ID":"a91476fa-3559-46e8-a2de-14d0d36d2cad","Type":"ContainerDied","Data":"1a06e954eda154383ac24758b4908f2734d59662f9cbd1302a72e8ba42672d21"} Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.157320 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-7lwz7"] Feb 21 00:17:59 crc kubenswrapper[4906]: E0221 00:17:59.157891 4906 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="94d58122-e78f-4369-937c-ba79bf5eec59" containerName="extract" Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.157904 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="94d58122-e78f-4369-937c-ba79bf5eec59" containerName="extract" Feb 21 00:17:59 crc kubenswrapper[4906]: E0221 00:17:59.157920 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94d58122-e78f-4369-937c-ba79bf5eec59" containerName="pull" Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.157926 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="94d58122-e78f-4369-937c-ba79bf5eec59" containerName="pull" Feb 21 00:17:59 crc kubenswrapper[4906]: E0221 00:17:59.157945 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94d58122-e78f-4369-937c-ba79bf5eec59" containerName="util" Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.157952 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="94d58122-e78f-4369-937c-ba79bf5eec59" containerName="util" Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.158050 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="94d58122-e78f-4369-937c-ba79bf5eec59" containerName="extract" Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.158432 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-7lwz7" Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.160157 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-vsbqc" Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.160329 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.160909 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.180897 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-7lwz7"] Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.275412 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m648g\" (UniqueName: \"kubernetes.io/projected/8600dc47-ea8b-4033-b9cc-fbb62f54e36e-kube-api-access-m648g\") pod \"obo-prometheus-operator-68bc856cb9-7lwz7\" (UID: \"8600dc47-ea8b-4033-b9cc-fbb62f54e36e\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-7lwz7" Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.330784 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-86d9894984-bcvsj"] Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.331589 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86d9894984-bcvsj" Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.333552 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.333887 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-6sl8h" Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.350154 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-86d9894984-2k8rf"] Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.351036 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86d9894984-2k8rf" Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.354904 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-86d9894984-bcvsj"] Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.360634 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-86d9894984-2k8rf"] Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.376890 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m648g\" (UniqueName: \"kubernetes.io/projected/8600dc47-ea8b-4033-b9cc-fbb62f54e36e-kube-api-access-m648g\") pod \"obo-prometheus-operator-68bc856cb9-7lwz7\" (UID: \"8600dc47-ea8b-4033-b9cc-fbb62f54e36e\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-7lwz7" Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.398273 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m648g\" (UniqueName: 
\"kubernetes.io/projected/8600dc47-ea8b-4033-b9cc-fbb62f54e36e-kube-api-access-m648g\") pod \"obo-prometheus-operator-68bc856cb9-7lwz7\" (UID: \"8600dc47-ea8b-4033-b9cc-fbb62f54e36e\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-7lwz7" Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.474365 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-7lwz7" Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.477518 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b36dc052-4396-45b6-9169-1efee41b5e32-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-86d9894984-2k8rf\" (UID: \"b36dc052-4396-45b6-9169-1efee41b5e32\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-86d9894984-2k8rf" Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.477564 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/93269518-8ff5-4e82-8f60-1a3382b87720-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-86d9894984-bcvsj\" (UID: \"93269518-8ff5-4e82-8f60-1a3382b87720\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-86d9894984-bcvsj" Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.477612 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b36dc052-4396-45b6-9169-1efee41b5e32-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-86d9894984-2k8rf\" (UID: \"b36dc052-4396-45b6-9169-1efee41b5e32\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-86d9894984-2k8rf" Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.477676 4906 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/93269518-8ff5-4e82-8f60-1a3382b87720-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-86d9894984-bcvsj\" (UID: \"93269518-8ff5-4e82-8f60-1a3382b87720\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-86d9894984-bcvsj" Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.489168 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-khgkm"] Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.490020 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-khgkm" Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.496955 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-lf4gd" Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.498814 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-khgkm"] Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.501446 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.581011 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-xbkjz"] Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.581517 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/93269518-8ff5-4e82-8f60-1a3382b87720-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-86d9894984-bcvsj\" (UID: \"93269518-8ff5-4e82-8f60-1a3382b87720\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-86d9894984-bcvsj" Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 
00:17:59.581560 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b36dc052-4396-45b6-9169-1efee41b5e32-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-86d9894984-2k8rf\" (UID: \"b36dc052-4396-45b6-9169-1efee41b5e32\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-86d9894984-2k8rf" Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.581592 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/93269518-8ff5-4e82-8f60-1a3382b87720-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-86d9894984-bcvsj\" (UID: \"93269518-8ff5-4e82-8f60-1a3382b87720\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-86d9894984-bcvsj" Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.581640 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b36dc052-4396-45b6-9169-1efee41b5e32-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-86d9894984-2k8rf\" (UID: \"b36dc052-4396-45b6-9169-1efee41b5e32\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-86d9894984-2k8rf" Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.581805 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-xbkjz" Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.584608 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-lljk5" Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.585847 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b36dc052-4396-45b6-9169-1efee41b5e32-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-86d9894984-2k8rf\" (UID: \"b36dc052-4396-45b6-9169-1efee41b5e32\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-86d9894984-2k8rf" Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.586381 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/93269518-8ff5-4e82-8f60-1a3382b87720-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-86d9894984-bcvsj\" (UID: \"93269518-8ff5-4e82-8f60-1a3382b87720\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-86d9894984-bcvsj" Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.589693 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/93269518-8ff5-4e82-8f60-1a3382b87720-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-86d9894984-bcvsj\" (UID: \"93269518-8ff5-4e82-8f60-1a3382b87720\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-86d9894984-bcvsj" Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.594405 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-xbkjz"] Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.601162 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/b36dc052-4396-45b6-9169-1efee41b5e32-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-86d9894984-2k8rf\" (UID: \"b36dc052-4396-45b6-9169-1efee41b5e32\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-86d9894984-2k8rf" Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.644477 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86d9894984-bcvsj" Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.671076 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86d9894984-2k8rf" Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.682706 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/bc85b339-0a0f-4745-88fa-8b48af27f6be-observability-operator-tls\") pod \"observability-operator-59bdc8b94-khgkm\" (UID: \"bc85b339-0a0f-4745-88fa-8b48af27f6be\") " pod="openshift-operators/observability-operator-59bdc8b94-khgkm" Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.682848 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktkf6\" (UniqueName: \"kubernetes.io/projected/bc85b339-0a0f-4745-88fa-8b48af27f6be-kube-api-access-ktkf6\") pod \"observability-operator-59bdc8b94-khgkm\" (UID: \"bc85b339-0a0f-4745-88fa-8b48af27f6be\") " pod="openshift-operators/observability-operator-59bdc8b94-khgkm" Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.682891 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt29z\" (UniqueName: \"kubernetes.io/projected/34cdaea4-84d8-4753-be71-0b73d7e9d1ba-kube-api-access-vt29z\") pod \"perses-operator-5bf474d74f-xbkjz\" (UID: 
\"34cdaea4-84d8-4753-be71-0b73d7e9d1ba\") " pod="openshift-operators/perses-operator-5bf474d74f-xbkjz" Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.682915 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/34cdaea4-84d8-4753-be71-0b73d7e9d1ba-openshift-service-ca\") pod \"perses-operator-5bf474d74f-xbkjz\" (UID: \"34cdaea4-84d8-4753-be71-0b73d7e9d1ba\") " pod="openshift-operators/perses-operator-5bf474d74f-xbkjz" Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.784267 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktkf6\" (UniqueName: \"kubernetes.io/projected/bc85b339-0a0f-4745-88fa-8b48af27f6be-kube-api-access-ktkf6\") pod \"observability-operator-59bdc8b94-khgkm\" (UID: \"bc85b339-0a0f-4745-88fa-8b48af27f6be\") " pod="openshift-operators/observability-operator-59bdc8b94-khgkm" Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.784334 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vt29z\" (UniqueName: \"kubernetes.io/projected/34cdaea4-84d8-4753-be71-0b73d7e9d1ba-kube-api-access-vt29z\") pod \"perses-operator-5bf474d74f-xbkjz\" (UID: \"34cdaea4-84d8-4753-be71-0b73d7e9d1ba\") " pod="openshift-operators/perses-operator-5bf474d74f-xbkjz" Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.784364 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/34cdaea4-84d8-4753-be71-0b73d7e9d1ba-openshift-service-ca\") pod \"perses-operator-5bf474d74f-xbkjz\" (UID: \"34cdaea4-84d8-4753-be71-0b73d7e9d1ba\") " pod="openshift-operators/perses-operator-5bf474d74f-xbkjz" Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.784403 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/bc85b339-0a0f-4745-88fa-8b48af27f6be-observability-operator-tls\") pod \"observability-operator-59bdc8b94-khgkm\" (UID: \"bc85b339-0a0f-4745-88fa-8b48af27f6be\") " pod="openshift-operators/observability-operator-59bdc8b94-khgkm" Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.785454 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/34cdaea4-84d8-4753-be71-0b73d7e9d1ba-openshift-service-ca\") pod \"perses-operator-5bf474d74f-xbkjz\" (UID: \"34cdaea4-84d8-4753-be71-0b73d7e9d1ba\") " pod="openshift-operators/perses-operator-5bf474d74f-xbkjz" Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.787652 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/bc85b339-0a0f-4745-88fa-8b48af27f6be-observability-operator-tls\") pod \"observability-operator-59bdc8b94-khgkm\" (UID: \"bc85b339-0a0f-4745-88fa-8b48af27f6be\") " pod="openshift-operators/observability-operator-59bdc8b94-khgkm" Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.804312 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt29z\" (UniqueName: \"kubernetes.io/projected/34cdaea4-84d8-4753-be71-0b73d7e9d1ba-kube-api-access-vt29z\") pod \"perses-operator-5bf474d74f-xbkjz\" (UID: \"34cdaea4-84d8-4753-be71-0b73d7e9d1ba\") " pod="openshift-operators/perses-operator-5bf474d74f-xbkjz" Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.806969 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktkf6\" (UniqueName: \"kubernetes.io/projected/bc85b339-0a0f-4745-88fa-8b48af27f6be-kube-api-access-ktkf6\") pod \"observability-operator-59bdc8b94-khgkm\" (UID: \"bc85b339-0a0f-4745-88fa-8b48af27f6be\") " pod="openshift-operators/observability-operator-59bdc8b94-khgkm" Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 
00:17:59.807311 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-khgkm" Feb 21 00:17:59 crc kubenswrapper[4906]: I0221 00:17:59.933242 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-xbkjz" Feb 21 00:18:01 crc kubenswrapper[4906]: I0221 00:18:01.303320 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-7lwz7"] Feb 21 00:18:01 crc kubenswrapper[4906]: W0221 00:18:01.331075 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8600dc47_ea8b_4033_b9cc_fbb62f54e36e.slice/crio-0200a496093b5a4c739febce2c0e67b7ca930a3957b928815a5786f2316946b1 WatchSource:0}: Error finding container 0200a496093b5a4c739febce2c0e67b7ca930a3957b928815a5786f2316946b1: Status 404 returned error can't find the container with id 0200a496093b5a4c739febce2c0e67b7ca930a3957b928815a5786f2316946b1 Feb 21 00:18:01 crc kubenswrapper[4906]: I0221 00:18:01.420416 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-86d9894984-bcvsj"] Feb 21 00:18:01 crc kubenswrapper[4906]: W0221 00:18:01.425829 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93269518_8ff5_4e82_8f60_1a3382b87720.slice/crio-96fd212cebb43604333ce7fba4811ba45ed7fcc2f17368c00a37396f4c3e5abe WatchSource:0}: Error finding container 96fd212cebb43604333ce7fba4811ba45ed7fcc2f17368c00a37396f4c3e5abe: Status 404 returned error can't find the container with id 96fd212cebb43604333ce7fba4811ba45ed7fcc2f17368c00a37396f4c3e5abe Feb 21 00:18:01 crc kubenswrapper[4906]: I0221 00:18:01.443528 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operators/observability-operator-59bdc8b94-khgkm"] Feb 21 00:18:01 crc kubenswrapper[4906]: W0221 00:18:01.448210 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc85b339_0a0f_4745_88fa_8b48af27f6be.slice/crio-68fe8146e7ea0c76f271128dd3772b33999bf9adae2989340fab2b0c0517f34d WatchSource:0}: Error finding container 68fe8146e7ea0c76f271128dd3772b33999bf9adae2989340fab2b0c0517f34d: Status 404 returned error can't find the container with id 68fe8146e7ea0c76f271128dd3772b33999bf9adae2989340fab2b0c0517f34d Feb 21 00:18:01 crc kubenswrapper[4906]: I0221 00:18:01.486818 4906 generic.go:334] "Generic (PLEG): container finished" podID="a91476fa-3559-46e8-a2de-14d0d36d2cad" containerID="4e9f04257a53b9a72b16b2b315add2df2ff3b95a1922fa4d26257860b37f255b" exitCode=0 Feb 21 00:18:01 crc kubenswrapper[4906]: I0221 00:18:01.486919 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56bzg5" event={"ID":"a91476fa-3559-46e8-a2de-14d0d36d2cad","Type":"ContainerDied","Data":"4e9f04257a53b9a72b16b2b315add2df2ff3b95a1922fa4d26257860b37f255b"} Feb 21 00:18:01 crc kubenswrapper[4906]: I0221 00:18:01.493070 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-7lwz7" event={"ID":"8600dc47-ea8b-4033-b9cc-fbb62f54e36e","Type":"ContainerStarted","Data":"0200a496093b5a4c739febce2c0e67b7ca930a3957b928815a5786f2316946b1"} Feb 21 00:18:01 crc kubenswrapper[4906]: I0221 00:18:01.495107 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-khgkm" event={"ID":"bc85b339-0a0f-4745-88fa-8b48af27f6be","Type":"ContainerStarted","Data":"68fe8146e7ea0c76f271128dd3772b33999bf9adae2989340fab2b0c0517f34d"} Feb 21 00:18:01 crc kubenswrapper[4906]: I0221 00:18:01.496709 4906 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86d9894984-bcvsj" event={"ID":"93269518-8ff5-4e82-8f60-1a3382b87720","Type":"ContainerStarted","Data":"96fd212cebb43604333ce7fba4811ba45ed7fcc2f17368c00a37396f4c3e5abe"} Feb 21 00:18:01 crc kubenswrapper[4906]: I0221 00:18:01.551100 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-xbkjz"] Feb 21 00:18:01 crc kubenswrapper[4906]: W0221 00:18:01.557916 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb36dc052_4396_45b6_9169_1efee41b5e32.slice/crio-23d969517950fcd26144c222740d520f3b0e7025d7f832f207b3bea91a0644bb WatchSource:0}: Error finding container 23d969517950fcd26144c222740d520f3b0e7025d7f832f207b3bea91a0644bb: Status 404 returned error can't find the container with id 23d969517950fcd26144c222740d520f3b0e7025d7f832f207b3bea91a0644bb Feb 21 00:18:01 crc kubenswrapper[4906]: I0221 00:18:01.561082 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-86d9894984-2k8rf"] Feb 21 00:18:01 crc kubenswrapper[4906]: W0221 00:18:01.565093 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34cdaea4_84d8_4753_be71_0b73d7e9d1ba.slice/crio-c0bca3caabf13d86a93dfe37e1445ad14e9b6c7b8fdbb4c72cc1cef9601ef55d WatchSource:0}: Error finding container c0bca3caabf13d86a93dfe37e1445ad14e9b6c7b8fdbb4c72cc1cef9601ef55d: Status 404 returned error can't find the container with id c0bca3caabf13d86a93dfe37e1445ad14e9b6c7b8fdbb4c72cc1cef9601ef55d Feb 21 00:18:02 crc kubenswrapper[4906]: I0221 00:18:02.428990 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elastic-operator-857cb8f45c-lpdp7"] Feb 21 00:18:02 crc kubenswrapper[4906]: I0221 00:18:02.429952 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/elastic-operator-857cb8f45c-lpdp7" Feb 21 00:18:02 crc kubenswrapper[4906]: I0221 00:18:02.431601 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"openshift-service-ca.crt" Feb 21 00:18:02 crc kubenswrapper[4906]: I0221 00:18:02.431795 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"kube-root-ca.crt" Feb 21 00:18:02 crc kubenswrapper[4906]: I0221 00:18:02.432680 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-service-cert" Feb 21 00:18:02 crc kubenswrapper[4906]: I0221 00:18:02.433478 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-dockercfg-vks55" Feb 21 00:18:02 crc kubenswrapper[4906]: I0221 00:18:02.451953 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-857cb8f45c-lpdp7"] Feb 21 00:18:02 crc kubenswrapper[4906]: I0221 00:18:02.516245 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/387daace-d19a-4663-a40e-f2d3dca44f01-apiservice-cert\") pod \"elastic-operator-857cb8f45c-lpdp7\" (UID: \"387daace-d19a-4663-a40e-f2d3dca44f01\") " pod="service-telemetry/elastic-operator-857cb8f45c-lpdp7" Feb 21 00:18:02 crc kubenswrapper[4906]: I0221 00:18:02.516298 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/387daace-d19a-4663-a40e-f2d3dca44f01-webhook-cert\") pod \"elastic-operator-857cb8f45c-lpdp7\" (UID: \"387daace-d19a-4663-a40e-f2d3dca44f01\") " pod="service-telemetry/elastic-operator-857cb8f45c-lpdp7" Feb 21 00:18:02 crc kubenswrapper[4906]: I0221 00:18:02.516354 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-kjgqd\" (UniqueName: \"kubernetes.io/projected/387daace-d19a-4663-a40e-f2d3dca44f01-kube-api-access-kjgqd\") pod \"elastic-operator-857cb8f45c-lpdp7\" (UID: \"387daace-d19a-4663-a40e-f2d3dca44f01\") " pod="service-telemetry/elastic-operator-857cb8f45c-lpdp7" Feb 21 00:18:02 crc kubenswrapper[4906]: I0221 00:18:02.528489 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-xbkjz" event={"ID":"34cdaea4-84d8-4753-be71-0b73d7e9d1ba","Type":"ContainerStarted","Data":"c0bca3caabf13d86a93dfe37e1445ad14e9b6c7b8fdbb4c72cc1cef9601ef55d"} Feb 21 00:18:02 crc kubenswrapper[4906]: I0221 00:18:02.543950 4906 generic.go:334] "Generic (PLEG): container finished" podID="a91476fa-3559-46e8-a2de-14d0d36d2cad" containerID="e003ca8eb352d454bb6c048a30ccb1e905eb955b48f225c3fa79833572af3bc8" exitCode=0 Feb 21 00:18:02 crc kubenswrapper[4906]: I0221 00:18:02.544215 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56bzg5" event={"ID":"a91476fa-3559-46e8-a2de-14d0d36d2cad","Type":"ContainerDied","Data":"e003ca8eb352d454bb6c048a30ccb1e905eb955b48f225c3fa79833572af3bc8"} Feb 21 00:18:02 crc kubenswrapper[4906]: I0221 00:18:02.557869 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86d9894984-2k8rf" event={"ID":"b36dc052-4396-45b6-9169-1efee41b5e32","Type":"ContainerStarted","Data":"23d969517950fcd26144c222740d520f3b0e7025d7f832f207b3bea91a0644bb"} Feb 21 00:18:02 crc kubenswrapper[4906]: I0221 00:18:02.617366 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/387daace-d19a-4663-a40e-f2d3dca44f01-webhook-cert\") pod \"elastic-operator-857cb8f45c-lpdp7\" (UID: \"387daace-d19a-4663-a40e-f2d3dca44f01\") " pod="service-telemetry/elastic-operator-857cb8f45c-lpdp7" Feb 21 00:18:02 crc 
kubenswrapper[4906]: I0221 00:18:02.617737 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjgqd\" (UniqueName: \"kubernetes.io/projected/387daace-d19a-4663-a40e-f2d3dca44f01-kube-api-access-kjgqd\") pod \"elastic-operator-857cb8f45c-lpdp7\" (UID: \"387daace-d19a-4663-a40e-f2d3dca44f01\") " pod="service-telemetry/elastic-operator-857cb8f45c-lpdp7" Feb 21 00:18:02 crc kubenswrapper[4906]: I0221 00:18:02.618252 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/387daace-d19a-4663-a40e-f2d3dca44f01-apiservice-cert\") pod \"elastic-operator-857cb8f45c-lpdp7\" (UID: \"387daace-d19a-4663-a40e-f2d3dca44f01\") " pod="service-telemetry/elastic-operator-857cb8f45c-lpdp7" Feb 21 00:18:02 crc kubenswrapper[4906]: I0221 00:18:02.643266 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/387daace-d19a-4663-a40e-f2d3dca44f01-webhook-cert\") pod \"elastic-operator-857cb8f45c-lpdp7\" (UID: \"387daace-d19a-4663-a40e-f2d3dca44f01\") " pod="service-telemetry/elastic-operator-857cb8f45c-lpdp7" Feb 21 00:18:02 crc kubenswrapper[4906]: I0221 00:18:02.646624 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjgqd\" (UniqueName: \"kubernetes.io/projected/387daace-d19a-4663-a40e-f2d3dca44f01-kube-api-access-kjgqd\") pod \"elastic-operator-857cb8f45c-lpdp7\" (UID: \"387daace-d19a-4663-a40e-f2d3dca44f01\") " pod="service-telemetry/elastic-operator-857cb8f45c-lpdp7" Feb 21 00:18:02 crc kubenswrapper[4906]: I0221 00:18:02.650701 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/387daace-d19a-4663-a40e-f2d3dca44f01-apiservice-cert\") pod \"elastic-operator-857cb8f45c-lpdp7\" (UID: \"387daace-d19a-4663-a40e-f2d3dca44f01\") " 
pod="service-telemetry/elastic-operator-857cb8f45c-lpdp7" Feb 21 00:18:02 crc kubenswrapper[4906]: I0221 00:18:02.751170 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elastic-operator-857cb8f45c-lpdp7" Feb 21 00:18:03 crc kubenswrapper[4906]: I0221 00:18:03.085753 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-857cb8f45c-lpdp7"] Feb 21 00:18:03 crc kubenswrapper[4906]: W0221 00:18:03.126672 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod387daace_d19a_4663_a40e_f2d3dca44f01.slice/crio-68d978a7c55f9cef6b9e73dbc4b604cc6139adad7b598631ee255001fe47ea5f WatchSource:0}: Error finding container 68d978a7c55f9cef6b9e73dbc4b604cc6139adad7b598631ee255001fe47ea5f: Status 404 returned error can't find the container with id 68d978a7c55f9cef6b9e73dbc4b604cc6139adad7b598631ee255001fe47ea5f Feb 21 00:18:03 crc kubenswrapper[4906]: I0221 00:18:03.566581 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-857cb8f45c-lpdp7" event={"ID":"387daace-d19a-4663-a40e-f2d3dca44f01","Type":"ContainerStarted","Data":"68d978a7c55f9cef6b9e73dbc4b604cc6139adad7b598631ee255001fe47ea5f"} Feb 21 00:18:03 crc kubenswrapper[4906]: I0221 00:18:03.897280 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56bzg5" Feb 21 00:18:04 crc kubenswrapper[4906]: I0221 00:18:04.051754 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a91476fa-3559-46e8-a2de-14d0d36d2cad-util\") pod \"a91476fa-3559-46e8-a2de-14d0d36d2cad\" (UID: \"a91476fa-3559-46e8-a2de-14d0d36d2cad\") " Feb 21 00:18:04 crc kubenswrapper[4906]: I0221 00:18:04.051835 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mp4j2\" (UniqueName: \"kubernetes.io/projected/a91476fa-3559-46e8-a2de-14d0d36d2cad-kube-api-access-mp4j2\") pod \"a91476fa-3559-46e8-a2de-14d0d36d2cad\" (UID: \"a91476fa-3559-46e8-a2de-14d0d36d2cad\") " Feb 21 00:18:04 crc kubenswrapper[4906]: I0221 00:18:04.051889 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a91476fa-3559-46e8-a2de-14d0d36d2cad-bundle\") pod \"a91476fa-3559-46e8-a2de-14d0d36d2cad\" (UID: \"a91476fa-3559-46e8-a2de-14d0d36d2cad\") " Feb 21 00:18:04 crc kubenswrapper[4906]: I0221 00:18:04.052872 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a91476fa-3559-46e8-a2de-14d0d36d2cad-bundle" (OuterVolumeSpecName: "bundle") pod "a91476fa-3559-46e8-a2de-14d0d36d2cad" (UID: "a91476fa-3559-46e8-a2de-14d0d36d2cad"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:18:04 crc kubenswrapper[4906]: I0221 00:18:04.059175 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a91476fa-3559-46e8-a2de-14d0d36d2cad-kube-api-access-mp4j2" (OuterVolumeSpecName: "kube-api-access-mp4j2") pod "a91476fa-3559-46e8-a2de-14d0d36d2cad" (UID: "a91476fa-3559-46e8-a2de-14d0d36d2cad"). InnerVolumeSpecName "kube-api-access-mp4j2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:18:04 crc kubenswrapper[4906]: I0221 00:18:04.066327 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a91476fa-3559-46e8-a2de-14d0d36d2cad-util" (OuterVolumeSpecName: "util") pod "a91476fa-3559-46e8-a2de-14d0d36d2cad" (UID: "a91476fa-3559-46e8-a2de-14d0d36d2cad"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:18:04 crc kubenswrapper[4906]: I0221 00:18:04.153557 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mp4j2\" (UniqueName: \"kubernetes.io/projected/a91476fa-3559-46e8-a2de-14d0d36d2cad-kube-api-access-mp4j2\") on node \"crc\" DevicePath \"\"" Feb 21 00:18:04 crc kubenswrapper[4906]: I0221 00:18:04.153586 4906 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a91476fa-3559-46e8-a2de-14d0d36d2cad-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 00:18:04 crc kubenswrapper[4906]: I0221 00:18:04.153595 4906 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a91476fa-3559-46e8-a2de-14d0d36d2cad-util\") on node \"crc\" DevicePath \"\"" Feb 21 00:18:04 crc kubenswrapper[4906]: I0221 00:18:04.596398 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56bzg5" event={"ID":"a91476fa-3559-46e8-a2de-14d0d36d2cad","Type":"ContainerDied","Data":"65de2b82707656955323bfb833dc63fe4a473709ba6b3ebc1e2166891279eca2"} Feb 21 00:18:04 crc kubenswrapper[4906]: I0221 00:18:04.596453 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65de2b82707656955323bfb833dc63fe4a473709ba6b3ebc1e2166891279eca2" Feb 21 00:18:04 crc kubenswrapper[4906]: I0221 00:18:04.596527 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56bzg5" Feb 21 00:18:16 crc kubenswrapper[4906]: I0221 00:18:16.400163 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-4v8z9"] Feb 21 00:18:16 crc kubenswrapper[4906]: E0221 00:18:16.401106 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a91476fa-3559-46e8-a2de-14d0d36d2cad" containerName="util" Feb 21 00:18:16 crc kubenswrapper[4906]: I0221 00:18:16.401130 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="a91476fa-3559-46e8-a2de-14d0d36d2cad" containerName="util" Feb 21 00:18:16 crc kubenswrapper[4906]: E0221 00:18:16.401145 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a91476fa-3559-46e8-a2de-14d0d36d2cad" containerName="pull" Feb 21 00:18:16 crc kubenswrapper[4906]: I0221 00:18:16.401156 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="a91476fa-3559-46e8-a2de-14d0d36d2cad" containerName="pull" Feb 21 00:18:16 crc kubenswrapper[4906]: E0221 00:18:16.401182 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a91476fa-3559-46e8-a2de-14d0d36d2cad" containerName="extract" Feb 21 00:18:16 crc kubenswrapper[4906]: I0221 00:18:16.401193 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="a91476fa-3559-46e8-a2de-14d0d36d2cad" containerName="extract" Feb 21 00:18:16 crc kubenswrapper[4906]: I0221 00:18:16.401335 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="a91476fa-3559-46e8-a2de-14d0d36d2cad" containerName="extract" Feb 21 00:18:16 crc kubenswrapper[4906]: I0221 00:18:16.402027 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-4v8z9" Feb 21 00:18:16 crc kubenswrapper[4906]: I0221 00:18:16.406124 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Feb 21 00:18:16 crc kubenswrapper[4906]: I0221 00:18:16.406154 4906 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-4tvhv" Feb 21 00:18:16 crc kubenswrapper[4906]: I0221 00:18:16.406332 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Feb 21 00:18:16 crc kubenswrapper[4906]: I0221 00:18:16.432557 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-4v8z9"] Feb 21 00:18:16 crc kubenswrapper[4906]: I0221 00:18:16.529870 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c2a146a1-78fb-4ee1-ad93-4facd4b687b0-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-4v8z9\" (UID: \"c2a146a1-78fb-4ee1-ad93-4facd4b687b0\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-4v8z9" Feb 21 00:18:16 crc kubenswrapper[4906]: I0221 00:18:16.530343 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xxz8\" (UniqueName: \"kubernetes.io/projected/c2a146a1-78fb-4ee1-ad93-4facd4b687b0-kube-api-access-6xxz8\") pod \"cert-manager-operator-controller-manager-5586865c96-4v8z9\" (UID: \"c2a146a1-78fb-4ee1-ad93-4facd4b687b0\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-4v8z9" Feb 21 00:18:16 crc kubenswrapper[4906]: I0221 00:18:16.631024 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-6xxz8\" (UniqueName: \"kubernetes.io/projected/c2a146a1-78fb-4ee1-ad93-4facd4b687b0-kube-api-access-6xxz8\") pod \"cert-manager-operator-controller-manager-5586865c96-4v8z9\" (UID: \"c2a146a1-78fb-4ee1-ad93-4facd4b687b0\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-4v8z9" Feb 21 00:18:16 crc kubenswrapper[4906]: I0221 00:18:16.631068 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c2a146a1-78fb-4ee1-ad93-4facd4b687b0-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-4v8z9\" (UID: \"c2a146a1-78fb-4ee1-ad93-4facd4b687b0\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-4v8z9" Feb 21 00:18:16 crc kubenswrapper[4906]: I0221 00:18:16.631449 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c2a146a1-78fb-4ee1-ad93-4facd4b687b0-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-4v8z9\" (UID: \"c2a146a1-78fb-4ee1-ad93-4facd4b687b0\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-4v8z9" Feb 21 00:18:16 crc kubenswrapper[4906]: I0221 00:18:16.650607 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xxz8\" (UniqueName: \"kubernetes.io/projected/c2a146a1-78fb-4ee1-ad93-4facd4b687b0-kube-api-access-6xxz8\") pod \"cert-manager-operator-controller-manager-5586865c96-4v8z9\" (UID: \"c2a146a1-78fb-4ee1-ad93-4facd4b687b0\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-4v8z9" Feb 21 00:18:16 crc kubenswrapper[4906]: I0221 00:18:16.728541 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-4v8z9" Feb 21 00:18:18 crc kubenswrapper[4906]: I0221 00:18:18.683601 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-857cb8f45c-lpdp7" event={"ID":"387daace-d19a-4663-a40e-f2d3dca44f01","Type":"ContainerStarted","Data":"8a7cb7b3d868aa762799d88c95ded3c43c90fc2eb8a74ece7926c6aab4122957"} Feb 21 00:18:18 crc kubenswrapper[4906]: I0221 00:18:18.686105 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86d9894984-bcvsj" event={"ID":"93269518-8ff5-4e82-8f60-1a3382b87720","Type":"ContainerStarted","Data":"dd0f9f0ae5418ec3c2a0221227de148e073083617922ee286a90a868c10af402"} Feb 21 00:18:18 crc kubenswrapper[4906]: I0221 00:18:18.691467 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-7lwz7" event={"ID":"8600dc47-ea8b-4033-b9cc-fbb62f54e36e","Type":"ContainerStarted","Data":"de42850300186f19f0caaeba709f55a5c350bea4c699fd793d5b6063be872e84"} Feb 21 00:18:18 crc kubenswrapper[4906]: I0221 00:18:18.693943 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-khgkm" event={"ID":"bc85b339-0a0f-4745-88fa-8b48af27f6be","Type":"ContainerStarted","Data":"f1cbd6b28bb5f98026a9d668b4a461a777aa493bb6af1cade59c08349d0d1526"} Feb 21 00:18:18 crc kubenswrapper[4906]: I0221 00:18:18.694615 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-khgkm" Feb 21 00:18:18 crc kubenswrapper[4906]: I0221 00:18:18.695270 4906 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-khgkm container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.41:8081/healthz\": dial tcp 10.217.0.41:8081: connect: connection refused" 
start-of-body= Feb 21 00:18:18 crc kubenswrapper[4906]: I0221 00:18:18.695303 4906 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-59bdc8b94-khgkm" podUID="bc85b339-0a0f-4745-88fa-8b48af27f6be" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.41:8081/healthz\": dial tcp 10.217.0.41:8081: connect: connection refused" Feb 21 00:18:18 crc kubenswrapper[4906]: I0221 00:18:18.697302 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86d9894984-2k8rf" event={"ID":"b36dc052-4396-45b6-9169-1efee41b5e32","Type":"ContainerStarted","Data":"2cb65a6dff6dc915a1dc74aec3ed5705773be59b9c5e68e17fe675daa8064f23"} Feb 21 00:18:18 crc kubenswrapper[4906]: I0221 00:18:18.698879 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-xbkjz" event={"ID":"34cdaea4-84d8-4753-be71-0b73d7e9d1ba","Type":"ContainerStarted","Data":"2483b3e86fe729dc1037b204d14562c4d840390cc4e8a6ba68393b406aeee146"} Feb 21 00:18:18 crc kubenswrapper[4906]: I0221 00:18:18.699392 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-xbkjz" Feb 21 00:18:18 crc kubenswrapper[4906]: I0221 00:18:18.737211 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elastic-operator-857cb8f45c-lpdp7" podStartSLOduration=1.625808294 podStartE2EDuration="16.737193411s" podCreationTimestamp="2026-02-21 00:18:02 +0000 UTC" firstStartedPulling="2026-02-21 00:18:03.174724496 +0000 UTC m=+618.426312002" lastFinishedPulling="2026-02-21 00:18:18.286109603 +0000 UTC m=+633.537697119" observedRunningTime="2026-02-21 00:18:18.710210362 +0000 UTC m=+633.961797868" watchObservedRunningTime="2026-02-21 00:18:18.737193411 +0000 UTC m=+633.988780917" Feb 21 00:18:18 crc kubenswrapper[4906]: I0221 00:18:18.738815 4906 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86d9894984-bcvsj" podStartSLOduration=2.8849984170000003 podStartE2EDuration="19.738809898s" podCreationTimestamp="2026-02-21 00:17:59 +0000 UTC" firstStartedPulling="2026-02-21 00:18:01.428193782 +0000 UTC m=+616.679781288" lastFinishedPulling="2026-02-21 00:18:18.282005263 +0000 UTC m=+633.533592769" observedRunningTime="2026-02-21 00:18:18.735316586 +0000 UTC m=+633.986904112" watchObservedRunningTime="2026-02-21 00:18:18.738809898 +0000 UTC m=+633.990397404" Feb 21 00:18:18 crc kubenswrapper[4906]: I0221 00:18:18.743535 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-4v8z9"] Feb 21 00:18:18 crc kubenswrapper[4906]: W0221 00:18:18.748783 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2a146a1_78fb_4ee1_ad93_4facd4b687b0.slice/crio-ab89d3be7537725e4887d2477ef6973aa7dd8f450ed01644565f841edf2a4b05 WatchSource:0}: Error finding container ab89d3be7537725e4887d2477ef6973aa7dd8f450ed01644565f841edf2a4b05: Status 404 returned error can't find the container with id ab89d3be7537725e4887d2477ef6973aa7dd8f450ed01644565f841edf2a4b05 Feb 21 00:18:18 crc kubenswrapper[4906]: I0221 00:18:18.768198 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-khgkm" podStartSLOduration=2.8697227 podStartE2EDuration="19.768182577s" podCreationTimestamp="2026-02-21 00:17:59 +0000 UTC" firstStartedPulling="2026-02-21 00:18:01.452169753 +0000 UTC m=+616.703757259" lastFinishedPulling="2026-02-21 00:18:18.35062963 +0000 UTC m=+633.602217136" observedRunningTime="2026-02-21 00:18:18.766897079 +0000 UTC m=+634.018484585" watchObservedRunningTime="2026-02-21 00:18:18.768182577 +0000 UTC m=+634.019770073" Feb 21 
00:18:18 crc kubenswrapper[4906]: I0221 00:18:18.791345 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-7lwz7" podStartSLOduration=2.840460976 podStartE2EDuration="19.791330014s" podCreationTimestamp="2026-02-21 00:17:59 +0000 UTC" firstStartedPulling="2026-02-21 00:18:01.334347389 +0000 UTC m=+616.585934905" lastFinishedPulling="2026-02-21 00:18:18.285216437 +0000 UTC m=+633.536803943" observedRunningTime="2026-02-21 00:18:18.790958673 +0000 UTC m=+634.042546179" watchObservedRunningTime="2026-02-21 00:18:18.791330014 +0000 UTC m=+634.042917520" Feb 21 00:18:18 crc kubenswrapper[4906]: I0221 00:18:18.806433 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-xbkjz" podStartSLOduration=3.027967628 podStartE2EDuration="19.806415205s" podCreationTimestamp="2026-02-21 00:17:59 +0000 UTC" firstStartedPulling="2026-02-21 00:18:01.572227784 +0000 UTC m=+616.823815300" lastFinishedPulling="2026-02-21 00:18:18.350675371 +0000 UTC m=+633.602262877" observedRunningTime="2026-02-21 00:18:18.805725135 +0000 UTC m=+634.057312641" watchObservedRunningTime="2026-02-21 00:18:18.806415205 +0000 UTC m=+634.058002711" Feb 21 00:18:18 crc kubenswrapper[4906]: I0221 00:18:18.830406 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86d9894984-2k8rf" podStartSLOduration=3.104913187 podStartE2EDuration="19.830392126s" podCreationTimestamp="2026-02-21 00:17:59 +0000 UTC" firstStartedPulling="2026-02-21 00:18:01.560391097 +0000 UTC m=+616.811978613" lastFinishedPulling="2026-02-21 00:18:18.285870036 +0000 UTC m=+633.537457552" observedRunningTime="2026-02-21 00:18:18.826798221 +0000 UTC m=+634.078385747" watchObservedRunningTime="2026-02-21 00:18:18.830392126 +0000 UTC m=+634.081979632" Feb 21 00:18:19 crc kubenswrapper[4906]: I0221 
00:18:19.709993 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-4v8z9" event={"ID":"c2a146a1-78fb-4ee1-ad93-4facd4b687b0","Type":"ContainerStarted","Data":"ab89d3be7537725e4887d2477ef6973aa7dd8f450ed01644565f841edf2a4b05"} Feb 21 00:18:19 crc kubenswrapper[4906]: I0221 00:18:19.712741 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-khgkm" Feb 21 00:18:19 crc kubenswrapper[4906]: I0221 00:18:19.793263 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Feb 21 00:18:19 crc kubenswrapper[4906]: I0221 00:18:19.794475 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Feb 21 00:18:19 crc kubenswrapper[4906]: I0221 00:18:19.794659 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0" Feb 21 00:18:19 crc kubenswrapper[4906]: I0221 00:18:19.799218 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-remote-ca" Feb 21 00:18:19 crc kubenswrapper[4906]: I0221 00:18:19.799579 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-xpack-file-realm" Feb 21 00:18:19 crc kubenswrapper[4906]: I0221 00:18:19.799867 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-transport-certs" Feb 21 00:18:19 crc kubenswrapper[4906]: I0221 00:18:19.800142 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-dockercfg-bjpz5" Feb 21 00:18:19 crc kubenswrapper[4906]: I0221 00:18:19.800373 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-config" Feb 21 00:18:19 crc 
kubenswrapper[4906]: I0221 00:18:19.800536 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-internal-users" Feb 21 00:18:19 crc kubenswrapper[4906]: I0221 00:18:19.800727 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-scripts" Feb 21 00:18:19 crc kubenswrapper[4906]: I0221 00:18:19.800883 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-http-certs-internal" Feb 21 00:18:19 crc kubenswrapper[4906]: I0221 00:18:19.800908 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-unicast-hosts" Feb 21 00:18:19 crc kubenswrapper[4906]: I0221 00:18:19.930884 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/674b1546-6063-4c50-bcd5-062ce5dc9637-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"674b1546-6063-4c50-bcd5-062ce5dc9637\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 21 00:18:19 crc kubenswrapper[4906]: I0221 00:18:19.931220 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/674b1546-6063-4c50-bcd5-062ce5dc9637-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"674b1546-6063-4c50-bcd5-062ce5dc9637\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 21 00:18:19 crc kubenswrapper[4906]: I0221 00:18:19.931321 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/674b1546-6063-4c50-bcd5-062ce5dc9637-elastic-internal-remote-certificate-authorities\") pod 
\"elasticsearch-es-default-0\" (UID: \"674b1546-6063-4c50-bcd5-062ce5dc9637\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 21 00:18:19 crc kubenswrapper[4906]: I0221 00:18:19.931465 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/674b1546-6063-4c50-bcd5-062ce5dc9637-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"674b1546-6063-4c50-bcd5-062ce5dc9637\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 21 00:18:19 crc kubenswrapper[4906]: I0221 00:18:19.931556 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/674b1546-6063-4c50-bcd5-062ce5dc9637-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"674b1546-6063-4c50-bcd5-062ce5dc9637\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 21 00:18:19 crc kubenswrapper[4906]: I0221 00:18:19.931812 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/674b1546-6063-4c50-bcd5-062ce5dc9637-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"674b1546-6063-4c50-bcd5-062ce5dc9637\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 21 00:18:19 crc kubenswrapper[4906]: I0221 00:18:19.931911 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/674b1546-6063-4c50-bcd5-062ce5dc9637-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"674b1546-6063-4c50-bcd5-062ce5dc9637\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 21 00:18:19 crc kubenswrapper[4906]: I0221 00:18:19.932002 4906 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/674b1546-6063-4c50-bcd5-062ce5dc9637-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"674b1546-6063-4c50-bcd5-062ce5dc9637\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 21 00:18:19 crc kubenswrapper[4906]: I0221 00:18:19.932077 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/674b1546-6063-4c50-bcd5-062ce5dc9637-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"674b1546-6063-4c50-bcd5-062ce5dc9637\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 21 00:18:19 crc kubenswrapper[4906]: I0221 00:18:19.932139 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/674b1546-6063-4c50-bcd5-062ce5dc9637-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"674b1546-6063-4c50-bcd5-062ce5dc9637\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 21 00:18:19 crc kubenswrapper[4906]: I0221 00:18:19.932232 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/674b1546-6063-4c50-bcd5-062ce5dc9637-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"674b1546-6063-4c50-bcd5-062ce5dc9637\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 21 00:18:19 crc kubenswrapper[4906]: I0221 00:18:19.932343 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/674b1546-6063-4c50-bcd5-062ce5dc9637-downward-api\") pod 
\"elasticsearch-es-default-0\" (UID: \"674b1546-6063-4c50-bcd5-062ce5dc9637\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 21 00:18:19 crc kubenswrapper[4906]: I0221 00:18:19.932454 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/674b1546-6063-4c50-bcd5-062ce5dc9637-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"674b1546-6063-4c50-bcd5-062ce5dc9637\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 21 00:18:19 crc kubenswrapper[4906]: I0221 00:18:19.932552 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/674b1546-6063-4c50-bcd5-062ce5dc9637-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"674b1546-6063-4c50-bcd5-062ce5dc9637\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 21 00:18:19 crc kubenswrapper[4906]: I0221 00:18:19.932655 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/674b1546-6063-4c50-bcd5-062ce5dc9637-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"674b1546-6063-4c50-bcd5-062ce5dc9637\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 21 00:18:20 crc kubenswrapper[4906]: I0221 00:18:20.034033 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/674b1546-6063-4c50-bcd5-062ce5dc9637-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"674b1546-6063-4c50-bcd5-062ce5dc9637\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 21 00:18:20 crc kubenswrapper[4906]: I0221 00:18:20.035145 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/674b1546-6063-4c50-bcd5-062ce5dc9637-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"674b1546-6063-4c50-bcd5-062ce5dc9637\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 21 00:18:20 crc kubenswrapper[4906]: I0221 00:18:20.035261 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/674b1546-6063-4c50-bcd5-062ce5dc9637-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"674b1546-6063-4c50-bcd5-062ce5dc9637\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 21 00:18:20 crc kubenswrapper[4906]: I0221 00:18:20.035358 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/674b1546-6063-4c50-bcd5-062ce5dc9637-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"674b1546-6063-4c50-bcd5-062ce5dc9637\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 21 00:18:20 crc kubenswrapper[4906]: I0221 00:18:20.035489 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/674b1546-6063-4c50-bcd5-062ce5dc9637-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"674b1546-6063-4c50-bcd5-062ce5dc9637\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 21 00:18:20 crc kubenswrapper[4906]: I0221 00:18:20.035567 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/674b1546-6063-4c50-bcd5-062ce5dc9637-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"674b1546-6063-4c50-bcd5-062ce5dc9637\") " 
pod="service-telemetry/elasticsearch-es-default-0" Feb 21 00:18:20 crc kubenswrapper[4906]: I0221 00:18:20.035656 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/674b1546-6063-4c50-bcd5-062ce5dc9637-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"674b1546-6063-4c50-bcd5-062ce5dc9637\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 21 00:18:20 crc kubenswrapper[4906]: I0221 00:18:20.035740 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/674b1546-6063-4c50-bcd5-062ce5dc9637-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"674b1546-6063-4c50-bcd5-062ce5dc9637\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 21 00:18:20 crc kubenswrapper[4906]: I0221 00:18:20.035830 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/674b1546-6063-4c50-bcd5-062ce5dc9637-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"674b1546-6063-4c50-bcd5-062ce5dc9637\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 21 00:18:20 crc kubenswrapper[4906]: I0221 00:18:20.035928 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/674b1546-6063-4c50-bcd5-062ce5dc9637-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"674b1546-6063-4c50-bcd5-062ce5dc9637\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 21 00:18:20 crc kubenswrapper[4906]: I0221 00:18:20.035997 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: 
\"kubernetes.io/empty-dir/674b1546-6063-4c50-bcd5-062ce5dc9637-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"674b1546-6063-4c50-bcd5-062ce5dc9637\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 21 00:18:20 crc kubenswrapper[4906]: I0221 00:18:20.036062 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/674b1546-6063-4c50-bcd5-062ce5dc9637-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"674b1546-6063-4c50-bcd5-062ce5dc9637\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 21 00:18:20 crc kubenswrapper[4906]: I0221 00:18:20.035746 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/674b1546-6063-4c50-bcd5-062ce5dc9637-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"674b1546-6063-4c50-bcd5-062ce5dc9637\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 21 00:18:20 crc kubenswrapper[4906]: I0221 00:18:20.036217 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/674b1546-6063-4c50-bcd5-062ce5dc9637-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"674b1546-6063-4c50-bcd5-062ce5dc9637\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 21 00:18:20 crc kubenswrapper[4906]: I0221 00:18:20.036280 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/674b1546-6063-4c50-bcd5-062ce5dc9637-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"674b1546-6063-4c50-bcd5-062ce5dc9637\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 21 00:18:20 crc kubenswrapper[4906]: I0221 00:18:20.036344 4906 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/674b1546-6063-4c50-bcd5-062ce5dc9637-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"674b1546-6063-4c50-bcd5-062ce5dc9637\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 21 00:18:20 crc kubenswrapper[4906]: I0221 00:18:20.036395 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/674b1546-6063-4c50-bcd5-062ce5dc9637-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"674b1546-6063-4c50-bcd5-062ce5dc9637\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 21 00:18:20 crc kubenswrapper[4906]: I0221 00:18:20.036434 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/674b1546-6063-4c50-bcd5-062ce5dc9637-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"674b1546-6063-4c50-bcd5-062ce5dc9637\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 21 00:18:20 crc kubenswrapper[4906]: I0221 00:18:20.037544 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/674b1546-6063-4c50-bcd5-062ce5dc9637-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"674b1546-6063-4c50-bcd5-062ce5dc9637\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 21 00:18:20 crc kubenswrapper[4906]: I0221 00:18:20.038155 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/674b1546-6063-4c50-bcd5-062ce5dc9637-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"674b1546-6063-4c50-bcd5-062ce5dc9637\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 21 
00:18:20 crc kubenswrapper[4906]: I0221 00:18:20.038334 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/674b1546-6063-4c50-bcd5-062ce5dc9637-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"674b1546-6063-4c50-bcd5-062ce5dc9637\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 21 00:18:20 crc kubenswrapper[4906]: I0221 00:18:20.038448 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/674b1546-6063-4c50-bcd5-062ce5dc9637-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"674b1546-6063-4c50-bcd5-062ce5dc9637\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 21 00:18:20 crc kubenswrapper[4906]: I0221 00:18:20.039198 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/674b1546-6063-4c50-bcd5-062ce5dc9637-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"674b1546-6063-4c50-bcd5-062ce5dc9637\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 21 00:18:20 crc kubenswrapper[4906]: I0221 00:18:20.040763 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/674b1546-6063-4c50-bcd5-062ce5dc9637-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"674b1546-6063-4c50-bcd5-062ce5dc9637\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 21 00:18:20 crc kubenswrapper[4906]: I0221 00:18:20.041357 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/674b1546-6063-4c50-bcd5-062ce5dc9637-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"674b1546-6063-4c50-bcd5-062ce5dc9637\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 
21 00:18:20 crc kubenswrapper[4906]: I0221 00:18:20.041478 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/674b1546-6063-4c50-bcd5-062ce5dc9637-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"674b1546-6063-4c50-bcd5-062ce5dc9637\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 21 00:18:20 crc kubenswrapper[4906]: I0221 00:18:20.041543 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/674b1546-6063-4c50-bcd5-062ce5dc9637-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"674b1546-6063-4c50-bcd5-062ce5dc9637\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 21 00:18:20 crc kubenswrapper[4906]: I0221 00:18:20.041892 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/674b1546-6063-4c50-bcd5-062ce5dc9637-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"674b1546-6063-4c50-bcd5-062ce5dc9637\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 21 00:18:20 crc kubenswrapper[4906]: I0221 00:18:20.041890 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/674b1546-6063-4c50-bcd5-062ce5dc9637-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"674b1546-6063-4c50-bcd5-062ce5dc9637\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 21 00:18:20 crc kubenswrapper[4906]: I0221 00:18:20.059913 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/674b1546-6063-4c50-bcd5-062ce5dc9637-elastic-internal-elasticsearch-config\") 
pod \"elasticsearch-es-default-0\" (UID: \"674b1546-6063-4c50-bcd5-062ce5dc9637\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 21 00:18:20 crc kubenswrapper[4906]: I0221 00:18:20.116542 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0" Feb 21 00:18:20 crc kubenswrapper[4906]: I0221 00:18:20.403675 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Feb 21 00:18:20 crc kubenswrapper[4906]: I0221 00:18:20.715129 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"674b1546-6063-4c50-bcd5-062ce5dc9637","Type":"ContainerStarted","Data":"90ef27de6232172d6a57eea3f2eeae924ffd416f60b1edefa581329cdf861df8"} Feb 21 00:18:22 crc kubenswrapper[4906]: I0221 00:18:22.726973 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-4v8z9" event={"ID":"c2a146a1-78fb-4ee1-ad93-4facd4b687b0","Type":"ContainerStarted","Data":"6184c54d9142a2536b6c289317d974d589e079cb71b1cdb516fd176730cac50f"} Feb 21 00:18:22 crc kubenswrapper[4906]: I0221 00:18:22.742145 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-4v8z9" podStartSLOduration=3.526163617 podStartE2EDuration="6.742131674s" podCreationTimestamp="2026-02-21 00:18:16 +0000 UTC" firstStartedPulling="2026-02-21 00:18:18.759658438 +0000 UTC m=+634.011245944" lastFinishedPulling="2026-02-21 00:18:21.975626495 +0000 UTC m=+637.227214001" observedRunningTime="2026-02-21 00:18:22.740657331 +0000 UTC m=+637.992244837" watchObservedRunningTime="2026-02-21 00:18:22.742131674 +0000 UTC m=+637.993719180" Feb 21 00:18:25 crc kubenswrapper[4906]: I0221 00:18:25.948408 4906 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["cert-manager/cert-manager-webhook-6888856db4-fvzcd"] Feb 21 00:18:25 crc kubenswrapper[4906]: I0221 00:18:25.949509 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-fvzcd" Feb 21 00:18:25 crc kubenswrapper[4906]: I0221 00:18:25.998005 4906 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-sp8xs" Feb 21 00:18:25 crc kubenswrapper[4906]: I0221 00:18:25.998072 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 21 00:18:25 crc kubenswrapper[4906]: I0221 00:18:25.998011 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 21 00:18:26 crc kubenswrapper[4906]: I0221 00:18:26.001792 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-fvzcd"] Feb 21 00:18:26 crc kubenswrapper[4906]: I0221 00:18:26.112020 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkqpt\" (UniqueName: \"kubernetes.io/projected/8c4cfe95-3d03-4fc5-a750-8e7580361fa2-kube-api-access-fkqpt\") pod \"cert-manager-webhook-6888856db4-fvzcd\" (UID: \"8c4cfe95-3d03-4fc5-a750-8e7580361fa2\") " pod="cert-manager/cert-manager-webhook-6888856db4-fvzcd" Feb 21 00:18:26 crc kubenswrapper[4906]: I0221 00:18:26.112468 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8c4cfe95-3d03-4fc5-a750-8e7580361fa2-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-fvzcd\" (UID: \"8c4cfe95-3d03-4fc5-a750-8e7580361fa2\") " pod="cert-manager/cert-manager-webhook-6888856db4-fvzcd" Feb 21 00:18:26 crc kubenswrapper[4906]: I0221 00:18:26.213304 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/8c4cfe95-3d03-4fc5-a750-8e7580361fa2-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-fvzcd\" (UID: \"8c4cfe95-3d03-4fc5-a750-8e7580361fa2\") " pod="cert-manager/cert-manager-webhook-6888856db4-fvzcd" Feb 21 00:18:26 crc kubenswrapper[4906]: I0221 00:18:26.213380 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkqpt\" (UniqueName: \"kubernetes.io/projected/8c4cfe95-3d03-4fc5-a750-8e7580361fa2-kube-api-access-fkqpt\") pod \"cert-manager-webhook-6888856db4-fvzcd\" (UID: \"8c4cfe95-3d03-4fc5-a750-8e7580361fa2\") " pod="cert-manager/cert-manager-webhook-6888856db4-fvzcd" Feb 21 00:18:26 crc kubenswrapper[4906]: I0221 00:18:26.239965 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkqpt\" (UniqueName: \"kubernetes.io/projected/8c4cfe95-3d03-4fc5-a750-8e7580361fa2-kube-api-access-fkqpt\") pod \"cert-manager-webhook-6888856db4-fvzcd\" (UID: \"8c4cfe95-3d03-4fc5-a750-8e7580361fa2\") " pod="cert-manager/cert-manager-webhook-6888856db4-fvzcd" Feb 21 00:18:26 crc kubenswrapper[4906]: I0221 00:18:26.258964 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8c4cfe95-3d03-4fc5-a750-8e7580361fa2-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-fvzcd\" (UID: \"8c4cfe95-3d03-4fc5-a750-8e7580361fa2\") " pod="cert-manager/cert-manager-webhook-6888856db4-fvzcd" Feb 21 00:18:26 crc kubenswrapper[4906]: I0221 00:18:26.329114 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-fvzcd" Feb 21 00:18:26 crc kubenswrapper[4906]: I0221 00:18:26.577218 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-fvzcd"] Feb 21 00:18:26 crc kubenswrapper[4906]: I0221 00:18:26.761716 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-fvzcd" event={"ID":"8c4cfe95-3d03-4fc5-a750-8e7580361fa2","Type":"ContainerStarted","Data":"c96b5cd253a801199e7baef68fc9c6ea75942b028328d3f74d6148722a27197f"} Feb 21 00:18:28 crc kubenswrapper[4906]: I0221 00:18:28.812074 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-jlzgl"] Feb 21 00:18:28 crc kubenswrapper[4906]: I0221 00:18:28.813053 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-jlzgl" Feb 21 00:18:28 crc kubenswrapper[4906]: I0221 00:18:28.817413 4906 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-9vdk7" Feb 21 00:18:28 crc kubenswrapper[4906]: I0221 00:18:28.821264 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-jlzgl"] Feb 21 00:18:28 crc kubenswrapper[4906]: I0221 00:18:28.955090 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6cc213c0-dc6e-4284-b3ef-4315fce21686-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-jlzgl\" (UID: \"6cc213c0-dc6e-4284-b3ef-4315fce21686\") " pod="cert-manager/cert-manager-cainjector-5545bd876-jlzgl" Feb 21 00:18:28 crc kubenswrapper[4906]: I0221 00:18:28.955134 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jms5\" (UniqueName: 
\"kubernetes.io/projected/6cc213c0-dc6e-4284-b3ef-4315fce21686-kube-api-access-2jms5\") pod \"cert-manager-cainjector-5545bd876-jlzgl\" (UID: \"6cc213c0-dc6e-4284-b3ef-4315fce21686\") " pod="cert-manager/cert-manager-cainjector-5545bd876-jlzgl" Feb 21 00:18:29 crc kubenswrapper[4906]: I0221 00:18:29.057416 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6cc213c0-dc6e-4284-b3ef-4315fce21686-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-jlzgl\" (UID: \"6cc213c0-dc6e-4284-b3ef-4315fce21686\") " pod="cert-manager/cert-manager-cainjector-5545bd876-jlzgl" Feb 21 00:18:29 crc kubenswrapper[4906]: I0221 00:18:29.057465 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jms5\" (UniqueName: \"kubernetes.io/projected/6cc213c0-dc6e-4284-b3ef-4315fce21686-kube-api-access-2jms5\") pod \"cert-manager-cainjector-5545bd876-jlzgl\" (UID: \"6cc213c0-dc6e-4284-b3ef-4315fce21686\") " pod="cert-manager/cert-manager-cainjector-5545bd876-jlzgl" Feb 21 00:18:29 crc kubenswrapper[4906]: I0221 00:18:29.082584 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jms5\" (UniqueName: \"kubernetes.io/projected/6cc213c0-dc6e-4284-b3ef-4315fce21686-kube-api-access-2jms5\") pod \"cert-manager-cainjector-5545bd876-jlzgl\" (UID: \"6cc213c0-dc6e-4284-b3ef-4315fce21686\") " pod="cert-manager/cert-manager-cainjector-5545bd876-jlzgl" Feb 21 00:18:29 crc kubenswrapper[4906]: I0221 00:18:29.103476 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6cc213c0-dc6e-4284-b3ef-4315fce21686-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-jlzgl\" (UID: \"6cc213c0-dc6e-4284-b3ef-4315fce21686\") " pod="cert-manager/cert-manager-cainjector-5545bd876-jlzgl" Feb 21 00:18:29 crc kubenswrapper[4906]: I0221 00:18:29.131477 4906 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-jlzgl"
Feb 21 00:18:29 crc kubenswrapper[4906]: I0221 00:18:29.379142 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-jlzgl"]
Feb 21 00:18:29 crc kubenswrapper[4906]: I0221 00:18:29.779185 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-jlzgl" event={"ID":"6cc213c0-dc6e-4284-b3ef-4315fce21686","Type":"ContainerStarted","Data":"410f131c530a5a00ae7e4d8fc1d2d6b76d9e6e2d8885398ec0b391d5d2a2d980"}
Feb 21 00:18:29 crc kubenswrapper[4906]: I0221 00:18:29.936418 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-xbkjz"
Feb 21 00:18:36 crc kubenswrapper[4906]: I0221 00:18:36.361533 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-sqwxp"]
Feb 21 00:18:36 crc kubenswrapper[4906]: I0221 00:18:36.362177 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-sqwxp"
Feb 21 00:18:36 crc kubenswrapper[4906]: I0221 00:18:36.366862 4906 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-dzxmn"
Feb 21 00:18:36 crc kubenswrapper[4906]: I0221 00:18:36.389204 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-sqwxp"]
Feb 21 00:18:36 crc kubenswrapper[4906]: I0221 00:18:36.485155 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbgp8\" (UniqueName: \"kubernetes.io/projected/7233cb65-44b5-4f88-9aa4-dd37bfdb0f2d-kube-api-access-kbgp8\") pod \"cert-manager-545d4d4674-sqwxp\" (UID: \"7233cb65-44b5-4f88-9aa4-dd37bfdb0f2d\") " pod="cert-manager/cert-manager-545d4d4674-sqwxp"
Feb 21 00:18:36 crc kubenswrapper[4906]: I0221 00:18:36.485227 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7233cb65-44b5-4f88-9aa4-dd37bfdb0f2d-bound-sa-token\") pod \"cert-manager-545d4d4674-sqwxp\" (UID: \"7233cb65-44b5-4f88-9aa4-dd37bfdb0f2d\") " pod="cert-manager/cert-manager-545d4d4674-sqwxp"
Feb 21 00:18:36 crc kubenswrapper[4906]: I0221 00:18:36.586170 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbgp8\" (UniqueName: \"kubernetes.io/projected/7233cb65-44b5-4f88-9aa4-dd37bfdb0f2d-kube-api-access-kbgp8\") pod \"cert-manager-545d4d4674-sqwxp\" (UID: \"7233cb65-44b5-4f88-9aa4-dd37bfdb0f2d\") " pod="cert-manager/cert-manager-545d4d4674-sqwxp"
Feb 21 00:18:36 crc kubenswrapper[4906]: I0221 00:18:36.586243 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7233cb65-44b5-4f88-9aa4-dd37bfdb0f2d-bound-sa-token\") pod \"cert-manager-545d4d4674-sqwxp\" (UID: \"7233cb65-44b5-4f88-9aa4-dd37bfdb0f2d\") " pod="cert-manager/cert-manager-545d4d4674-sqwxp"
Feb 21 00:18:36 crc kubenswrapper[4906]: I0221 00:18:36.608826 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbgp8\" (UniqueName: \"kubernetes.io/projected/7233cb65-44b5-4f88-9aa4-dd37bfdb0f2d-kube-api-access-kbgp8\") pod \"cert-manager-545d4d4674-sqwxp\" (UID: \"7233cb65-44b5-4f88-9aa4-dd37bfdb0f2d\") " pod="cert-manager/cert-manager-545d4d4674-sqwxp"
Feb 21 00:18:36 crc kubenswrapper[4906]: I0221 00:18:36.611735 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7233cb65-44b5-4f88-9aa4-dd37bfdb0f2d-bound-sa-token\") pod \"cert-manager-545d4d4674-sqwxp\" (UID: \"7233cb65-44b5-4f88-9aa4-dd37bfdb0f2d\") " pod="cert-manager/cert-manager-545d4d4674-sqwxp"
Feb 21 00:18:36 crc kubenswrapper[4906]: I0221 00:18:36.693767 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-sqwxp"
Feb 21 00:18:40 crc kubenswrapper[4906]: I0221 00:18:40.464349 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-sqwxp"]
Feb 21 00:18:40 crc kubenswrapper[4906]: W0221 00:18:40.466300 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7233cb65_44b5_4f88_9aa4_dd37bfdb0f2d.slice/crio-2ca92f0bf33118fc2949ae87a8cde51d134b0e18a45161d85acd59d11d1582dc WatchSource:0}: Error finding container 2ca92f0bf33118fc2949ae87a8cde51d134b0e18a45161d85acd59d11d1582dc: Status 404 returned error can't find the container with id 2ca92f0bf33118fc2949ae87a8cde51d134b0e18a45161d85acd59d11d1582dc
Feb 21 00:18:40 crc kubenswrapper[4906]: I0221 00:18:40.853946 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-sqwxp"
event={"ID":"7233cb65-44b5-4f88-9aa4-dd37bfdb0f2d","Type":"ContainerStarted","Data":"7f5a5505c7d8a2ab5cf62b17c7bb3270353fefb513a13d59989d20d8abdfb90d"}
Feb 21 00:18:40 crc kubenswrapper[4906]: I0221 00:18:40.853991 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-sqwxp" event={"ID":"7233cb65-44b5-4f88-9aa4-dd37bfdb0f2d","Type":"ContainerStarted","Data":"2ca92f0bf33118fc2949ae87a8cde51d134b0e18a45161d85acd59d11d1582dc"}
Feb 21 00:18:40 crc kubenswrapper[4906]: I0221 00:18:40.855442 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-jlzgl" event={"ID":"6cc213c0-dc6e-4284-b3ef-4315fce21686","Type":"ContainerStarted","Data":"20a78cde549b14cfe2f8867826513a421119d8fcf4e57679921f79365952a9ae"}
Feb 21 00:18:40 crc kubenswrapper[4906]: I0221 00:18:40.856854 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-fvzcd" event={"ID":"8c4cfe95-3d03-4fc5-a750-8e7580361fa2","Type":"ContainerStarted","Data":"d841d5921d2a36edd9f38025b46a2b1aa5a5e8a85bc22467d17ab20052f1d080"}
Feb 21 00:18:40 crc kubenswrapper[4906]: I0221 00:18:40.856921 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-fvzcd"
Feb 21 00:18:40 crc kubenswrapper[4906]: I0221 00:18:40.858331 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"674b1546-6063-4c50-bcd5-062ce5dc9637","Type":"ContainerStarted","Data":"0c457f55f30cead1452399c90d0cb4bc0a8e655a8f4e604e4b976b47bf5967d5"}
Feb 21 00:18:40 crc kubenswrapper[4906]: I0221 00:18:40.871427 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-sqwxp" podStartSLOduration=4.87141053 podStartE2EDuration="4.87141053s" podCreationTimestamp="2026-02-21 00:18:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:18:40.867362542 +0000 UTC m=+656.118950088" watchObservedRunningTime="2026-02-21 00:18:40.87141053 +0000 UTC m=+656.122998046"
Feb 21 00:18:40 crc kubenswrapper[4906]: I0221 00:18:40.947938 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-jlzgl" podStartSLOduration=2.574266829 podStartE2EDuration="12.947924487s" podCreationTimestamp="2026-02-21 00:18:28 +0000 UTC" firstStartedPulling="2026-02-21 00:18:29.386363664 +0000 UTC m=+644.637951170" lastFinishedPulling="2026-02-21 00:18:39.760021312 +0000 UTC m=+655.011608828" observedRunningTime="2026-02-21 00:18:40.946361241 +0000 UTC m=+656.197948747" watchObservedRunningTime="2026-02-21 00:18:40.947924487 +0000 UTC m=+656.199511993"
Feb 21 00:18:40 crc kubenswrapper[4906]: I0221 00:18:40.968982 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-fvzcd" podStartSLOduration=2.719935766 podStartE2EDuration="15.968963252s" podCreationTimestamp="2026-02-21 00:18:25 +0000 UTC" firstStartedPulling="2026-02-21 00:18:26.599169244 +0000 UTC m=+641.850756760" lastFinishedPulling="2026-02-21 00:18:39.84819674 +0000 UTC m=+655.099784246" observedRunningTime="2026-02-21 00:18:40.963829632 +0000 UTC m=+656.215417148" watchObservedRunningTime="2026-02-21 00:18:40.968963252 +0000 UTC m=+656.220550778"
Feb 21 00:18:41 crc kubenswrapper[4906]: I0221 00:18:41.053601 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"]
Feb 21 00:18:41 crc kubenswrapper[4906]: I0221 00:18:41.078361 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"]
Feb 21 00:18:42 crc kubenswrapper[4906]: I0221 00:18:42.871571 4906 generic.go:334] "Generic (PLEG): container finished" podID="674b1546-6063-4c50-bcd5-062ce5dc9637" containerID="0c457f55f30cead1452399c90d0cb4bc0a8e655a8f4e604e4b976b47bf5967d5" exitCode=0
Feb 21 00:18:42 crc kubenswrapper[4906]: I0221 00:18:42.871634 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"674b1546-6063-4c50-bcd5-062ce5dc9637","Type":"ContainerDied","Data":"0c457f55f30cead1452399c90d0cb4bc0a8e655a8f4e604e4b976b47bf5967d5"}
Feb 21 00:18:46 crc kubenswrapper[4906]: I0221 00:18:46.528112 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-fvzcd"
Feb 21 00:18:46 crc kubenswrapper[4906]: I0221 00:18:46.901305 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"674b1546-6063-4c50-bcd5-062ce5dc9637","Type":"ContainerStarted","Data":"4fb44282178557f4f85ef9257a07aece71b865bb24f808e786b04e4f46f67b02"}
Feb 21 00:18:47 crc kubenswrapper[4906]: I0221 00:18:47.910684 4906 generic.go:334] "Generic (PLEG): container finished" podID="674b1546-6063-4c50-bcd5-062ce5dc9637" containerID="4fb44282178557f4f85ef9257a07aece71b865bb24f808e786b04e4f46f67b02" exitCode=0
Feb 21 00:18:47 crc kubenswrapper[4906]: I0221 00:18:47.910806 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"674b1546-6063-4c50-bcd5-062ce5dc9637","Type":"ContainerDied","Data":"4fb44282178557f4f85ef9257a07aece71b865bb24f808e786b04e4f46f67b02"}
Feb 21 00:18:48 crc kubenswrapper[4906]: I0221 00:18:48.921099 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"674b1546-6063-4c50-bcd5-062ce5dc9637","Type":"ContainerStarted","Data":"82d5a7b04c289a658a4ab13d65b0fb63c9e262ad6655035193aa92ac316890f3"}
Feb 21 00:18:48 crc kubenswrapper[4906]: I0221 00:18:48.922483 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/elasticsearch-es-default-0"
Feb 21 00:18:48 crc kubenswrapper[4906]: I0221 00:18:48.975149 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elasticsearch-es-default-0" podStartSLOduration=10.158197492 podStartE2EDuration="29.975128282s" podCreationTimestamp="2026-02-21 00:18:19 +0000 UTC" firstStartedPulling="2026-02-21 00:18:20.422755853 +0000 UTC m=+635.674343359" lastFinishedPulling="2026-02-21 00:18:40.239686633 +0000 UTC m=+655.491274149" observedRunningTime="2026-02-21 00:18:48.973263787 +0000 UTC m=+664.224851323" watchObservedRunningTime="2026-02-21 00:18:48.975128282 +0000 UTC m=+664.226715808"
Feb 21 00:19:00 crc kubenswrapper[4906]: I0221 00:19:00.224873 4906 prober.go:107] "Probe failed" probeType="Readiness" pod="service-telemetry/elasticsearch-es-default-0" podUID="674b1546-6063-4c50-bcd5-062ce5dc9637" containerName="elasticsearch" probeResult="failure" output=<
Feb 21 00:19:00 crc kubenswrapper[4906]: {"timestamp": "2026-02-21T00:19:00+00:00", "message": "readiness probe failed", "curl_rc": "7"}
Feb 21 00:19:00 crc kubenswrapper[4906]: >
Feb 21 00:19:05 crc kubenswrapper[4906]: I0221 00:19:05.818547 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/elasticsearch-es-default-0"
Feb 21 00:19:11 crc kubenswrapper[4906]: I0221 00:19:11.202209 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"]
Feb 21 00:19:11 crc kubenswrapper[4906]: I0221 00:19:11.203886 4906 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build"
Feb 21 00:19:11 crc kubenswrapper[4906]: I0221 00:19:11.208039 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-1-ca"
Feb 21 00:19:11 crc kubenswrapper[4906]: I0221 00:19:11.208108 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-1-global-ca"
Feb 21 00:19:11 crc kubenswrapper[4906]: I0221 00:19:11.208865 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-rw7jx"
Feb 21 00:19:11 crc kubenswrapper[4906]: I0221 00:19:11.209574 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-1-sys-config"
Feb 21 00:19:11 crc kubenswrapper[4906]: I0221 00:19:11.211376 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"service-telemetry-framework-index-dockercfg"
Feb 21 00:19:11 crc kubenswrapper[4906]: I0221 00:19:11.237901 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"]
Feb 21 00:19:11 crc kubenswrapper[4906]: I0221 00:19:11.393647 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c76dc364-84a8-4076-9206-56b74f0287b5-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Feb 21 00:19:11 crc kubenswrapper[4906]: I0221 00:19:11.393709 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c76dc364-84a8-4076-9206-56b74f0287b5-buildcachedir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Feb 21 00:19:11 crc kubenswrapper[4906]: I0221 00:19:11.393731 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c76dc364-84a8-4076-9206-56b74f0287b5-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Feb 21 00:19:11 crc kubenswrapper[4906]: I0221 00:19:11.393754 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbvpg\" (UniqueName: \"kubernetes.io/projected/c76dc364-84a8-4076-9206-56b74f0287b5-kube-api-access-cbvpg\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Feb 21 00:19:11 crc kubenswrapper[4906]: I0221 00:19:11.393907 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-rw7jx-pull\" (UniqueName: \"kubernetes.io/secret/c76dc364-84a8-4076-9206-56b74f0287b5-builder-dockercfg-rw7jx-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Feb 21 00:19:11 crc kubenswrapper[4906]: I0221 00:19:11.393996 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c76dc364-84a8-4076-9206-56b74f0287b5-container-storage-root\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Feb 21 00:19:11 crc kubenswrapper[4906]: I0221 00:19:11.394045 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c76dc364-84a8-4076-9206-56b74f0287b5-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Feb 21 00:19:11 crc kubenswrapper[4906]: I0221 00:19:11.394102 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c76dc364-84a8-4076-9206-56b74f0287b5-build-blob-cache\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Feb 21 00:19:11 crc kubenswrapper[4906]: I0221 00:19:11.394176 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c76dc364-84a8-4076-9206-56b74f0287b5-node-pullsecrets\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Feb 21 00:19:11 crc kubenswrapper[4906]: I0221 00:19:11.394223 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c76dc364-84a8-4076-9206-56b74f0287b5-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Feb 21 00:19:11 crc kubenswrapper[4906]: I0221 00:19:11.394268 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-rw7jx-push\" (UniqueName: \"kubernetes.io/secret/c76dc364-84a8-4076-9206-56b74f0287b5-builder-dockercfg-rw7jx-push\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Feb 21 00:19:11 crc kubenswrapper[4906]: I0221 00:19:11.394360 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c76dc364-84a8-4076-9206-56b74f0287b5-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Feb 21 00:19:11 crc kubenswrapper[4906]: I0221 00:19:11.394459 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/c76dc364-84a8-4076-9206-56b74f0287b5-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Feb 21 00:19:11 crc kubenswrapper[4906]: I0221 00:19:11.495843 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-rw7jx-push\" (UniqueName: \"kubernetes.io/secret/c76dc364-84a8-4076-9206-56b74f0287b5-builder-dockercfg-rw7jx-push\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Feb 21 00:19:11 crc kubenswrapper[4906]: I0221 00:19:11.495926 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c76dc364-84a8-4076-9206-56b74f0287b5-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID:
\"c76dc364-84a8-4076-9206-56b74f0287b5\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Feb 21 00:19:11 crc kubenswrapper[4906]: I0221 00:19:11.495981 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/c76dc364-84a8-4076-9206-56b74f0287b5-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Feb 21 00:19:11 crc kubenswrapper[4906]: I0221 00:19:11.496088 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c76dc364-84a8-4076-9206-56b74f0287b5-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Feb 21 00:19:11 crc kubenswrapper[4906]: I0221 00:19:11.496181 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c76dc364-84a8-4076-9206-56b74f0287b5-buildcachedir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Feb 21 00:19:11 crc kubenswrapper[4906]: I0221 00:19:11.496225 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c76dc364-84a8-4076-9206-56b74f0287b5-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Feb 21 00:19:11 crc kubenswrapper[4906]: I0221 00:19:11.496279 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbvpg\" (UniqueName: \"kubernetes.io/projected/c76dc364-84a8-4076-9206-56b74f0287b5-kube-api-access-cbvpg\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Feb 21 00:19:11 crc kubenswrapper[4906]: I0221 00:19:11.496333 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-rw7jx-pull\" (UniqueName: \"kubernetes.io/secret/c76dc364-84a8-4076-9206-56b74f0287b5-builder-dockercfg-rw7jx-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Feb 21 00:19:11 crc kubenswrapper[4906]: I0221 00:19:11.496388 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c76dc364-84a8-4076-9206-56b74f0287b5-container-storage-root\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Feb 21 00:19:11 crc kubenswrapper[4906]: I0221 00:19:11.496428 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c76dc364-84a8-4076-9206-56b74f0287b5-build-blob-cache\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Feb 21 00:19:11 crc kubenswrapper[4906]: I0221 00:19:11.496461 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c76dc364-84a8-4076-9206-56b74f0287b5-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Feb 21 00:19:11 crc kubenswrapper[4906]: I0221 00:19:11.496530 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c76dc364-84a8-4076-9206-56b74f0287b5-node-pullsecrets\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Feb 21 00:19:11 crc kubenswrapper[4906]: I0221 00:19:11.496574 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c76dc364-84a8-4076-9206-56b74f0287b5-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Feb 21 00:19:11 crc kubenswrapper[4906]: I0221 00:19:11.496976 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c76dc364-84a8-4076-9206-56b74f0287b5-container-storage-root\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Feb 21 00:19:11 crc kubenswrapper[4906]: I0221 00:19:11.497047 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c76dc364-84a8-4076-9206-56b74f0287b5-node-pullsecrets\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Feb 21 00:19:11 crc kubenswrapper[4906]: I0221 00:19:11.497239 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c76dc364-84a8-4076-9206-56b74f0287b5-buildcachedir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Feb 21 00:19:11 crc kubenswrapper[4906]: I0221 00:19:11.497315 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c76dc364-84a8-4076-9206-56b74f0287b5-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Feb 21 00:19:11 crc kubenswrapper[4906]: I0221 00:19:11.497560 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c76dc364-84a8-4076-9206-56b74f0287b5-build-blob-cache\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Feb 21 00:19:11 crc kubenswrapper[4906]: I0221 00:19:11.497884 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c76dc364-84a8-4076-9206-56b74f0287b5-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Feb 21 00:19:11 crc kubenswrapper[4906]: I0221 00:19:11.497947 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c76dc364-84a8-4076-9206-56b74f0287b5-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Feb 21 00:19:11 crc kubenswrapper[4906]: I0221 00:19:11.498409 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c76dc364-84a8-4076-9206-56b74f0287b5-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Feb 21 00:19:11 crc kubenswrapper[4906]: I0221 00:19:11.498434 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c76dc364-84a8-4076-9206-56b74f0287b5-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Feb 21 00:19:11 crc kubenswrapper[4906]: I0221 00:19:11.503204 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-rw7jx-push\" (UniqueName: \"kubernetes.io/secret/c76dc364-84a8-4076-9206-56b74f0287b5-builder-dockercfg-rw7jx-push\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Feb 21 00:19:11 crc kubenswrapper[4906]: I0221 00:19:11.505220 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/c76dc364-84a8-4076-9206-56b74f0287b5-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Feb 21 00:19:11 crc kubenswrapper[4906]: I0221 00:19:11.505967 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-rw7jx-pull\" (UniqueName: \"kubernetes.io/secret/c76dc364-84a8-4076-9206-56b74f0287b5-builder-dockercfg-rw7jx-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Feb 21 00:19:11 crc kubenswrapper[4906]: I0221 00:19:11.532552 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbvpg\" (UniqueName: \"kubernetes.io/projected/c76dc364-84a8-4076-9206-56b74f0287b5-kube-api-access-cbvpg\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Feb 21 00:19:11 crc kubenswrapper[4906]: I0221 00:19:11.826646 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build"
Feb 21 00:19:12 crc kubenswrapper[4906]: I0221 00:19:12.302459 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"]
Feb 21 00:19:13 crc kubenswrapper[4906]: I0221 00:19:13.102197 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"c76dc364-84a8-4076-9206-56b74f0287b5","Type":"ContainerStarted","Data":"6248919b550800d1237893e34f3af49ef742d7357bd9e9c9b09f2d0e3dd6f79b"}
Feb 21 00:19:19 crc kubenswrapper[4906]: I0221 00:19:19.141588 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"c76dc364-84a8-4076-9206-56b74f0287b5","Type":"ContainerStarted","Data":"1f5edee6462aeb127ef0b77c720e528246366a247749ce0d6909b21289db37dd"}
Feb 21 00:19:19 crc kubenswrapper[4906]: E0221 00:19:19.213952 4906 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3717330381476959516, SKID=, AKID=F9:DE:2F:E9:32:18:1D:91:9B:B9:E7:95:44:B2:F0:DC:64:1E:3D:CB failed: x509: certificate signed by unknown authority"
Feb 21 00:19:20 crc kubenswrapper[4906]: I0221 00:19:20.245201 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"]
Feb 21 00:19:21 crc kubenswrapper[4906]: I0221 00:19:21.153522 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-framework-index-1-build" podUID="c76dc364-84a8-4076-9206-56b74f0287b5" containerName="git-clone" containerID="cri-o://1f5edee6462aeb127ef0b77c720e528246366a247749ce0d6909b21289db37dd" gracePeriod=30
Feb 21 00:19:21 crc kubenswrapper[4906]: I0221 00:19:21.508509 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-framework-index-1-build_c76dc364-84a8-4076-9206-56b74f0287b5/git-clone/0.log"
Feb 21 00:19:21 crc kubenswrapper[4906]: I0221 00:19:21.508588 4906 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build"
Feb 21 00:19:21 crc kubenswrapper[4906]: I0221 00:19:21.570782 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c76dc364-84a8-4076-9206-56b74f0287b5-build-system-configs\") pod \"c76dc364-84a8-4076-9206-56b74f0287b5\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") "
Feb 21 00:19:21 crc kubenswrapper[4906]: I0221 00:19:21.570839 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c76dc364-84a8-4076-9206-56b74f0287b5-container-storage-run\") pod \"c76dc364-84a8-4076-9206-56b74f0287b5\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") "
Feb 21 00:19:21 crc kubenswrapper[4906]: I0221 00:19:21.570871 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c76dc364-84a8-4076-9206-56b74f0287b5-build-blob-cache\") pod \"c76dc364-84a8-4076-9206-56b74f0287b5\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") "
Feb 21 00:19:21 crc kubenswrapper[4906]: I0221 00:19:21.570887 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c76dc364-84a8-4076-9206-56b74f0287b5-container-storage-root\") pod \"c76dc364-84a8-4076-9206-56b74f0287b5\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") "
Feb 21 00:19:21 crc kubenswrapper[4906]: I0221 00:19:21.570905 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c76dc364-84a8-4076-9206-56b74f0287b5-build-proxy-ca-bundles\") pod \"c76dc364-84a8-4076-9206-56b74f0287b5\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") "
Feb 21 00:19:21 crc kubenswrapper[4906]: I0221 00:19:21.570950 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c76dc364-84a8-4076-9206-56b74f0287b5-buildcachedir\") pod \"c76dc364-84a8-4076-9206-56b74f0287b5\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") "
Feb 21 00:19:21 crc kubenswrapper[4906]: I0221 00:19:21.571155 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c76dc364-84a8-4076-9206-56b74f0287b5-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "c76dc364-84a8-4076-9206-56b74f0287b5" (UID: "c76dc364-84a8-4076-9206-56b74f0287b5"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 21 00:19:21 crc kubenswrapper[4906]: I0221 00:19:21.571197 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c76dc364-84a8-4076-9206-56b74f0287b5-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "c76dc364-84a8-4076-9206-56b74f0287b5" (UID: "c76dc364-84a8-4076-9206-56b74f0287b5"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 21 00:19:21 crc kubenswrapper[4906]: I0221 00:19:21.571307 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c76dc364-84a8-4076-9206-56b74f0287b5-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "c76dc364-84a8-4076-9206-56b74f0287b5" (UID: "c76dc364-84a8-4076-9206-56b74f0287b5"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 21 00:19:21 crc kubenswrapper[4906]: I0221 00:19:21.571793 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c76dc364-84a8-4076-9206-56b74f0287b5-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "c76dc364-84a8-4076-9206-56b74f0287b5" (UID: "c76dc364-84a8-4076-9206-56b74f0287b5"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 21 00:19:21 crc kubenswrapper[4906]: I0221 00:19:21.571885 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c76dc364-84a8-4076-9206-56b74f0287b5-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "c76dc364-84a8-4076-9206-56b74f0287b5" (UID: "c76dc364-84a8-4076-9206-56b74f0287b5"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 00:19:21 crc kubenswrapper[4906]: I0221 00:19:21.571955 4906 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c76dc364-84a8-4076-9206-56b74f0287b5-container-storage-run\") on node \"crc\" DevicePath \"\""
Feb 21 00:19:21 crc kubenswrapper[4906]: I0221 00:19:21.571984 4906 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c76dc364-84a8-4076-9206-56b74f0287b5-build-blob-cache\") on node \"crc\" DevicePath \"\""
Feb 21 00:19:21 crc kubenswrapper[4906]: I0221 00:19:21.571997 4906 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c76dc364-84a8-4076-9206-56b74f0287b5-buildcachedir\") on node \"crc\" DevicePath \"\""
Feb 21 00:19:21 crc kubenswrapper[4906]: I0221 00:19:21.572268 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c76dc364-84a8-4076-9206-56b74f0287b5-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "c76dc364-84a8-4076-9206-56b74f0287b5" (UID: "c76dc364-84a8-4076-9206-56b74f0287b5"). InnerVolumeSpecName "build-system-configs".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:19:21 crc kubenswrapper[4906]: I0221 00:19:21.672249 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c76dc364-84a8-4076-9206-56b74f0287b5-node-pullsecrets\") pod \"c76dc364-84a8-4076-9206-56b74f0287b5\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") " Feb 21 00:19:21 crc kubenswrapper[4906]: I0221 00:19:21.672306 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c76dc364-84a8-4076-9206-56b74f0287b5-buildworkdir\") pod \"c76dc364-84a8-4076-9206-56b74f0287b5\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") " Feb 21 00:19:21 crc kubenswrapper[4906]: I0221 00:19:21.672343 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-rw7jx-pull\" (UniqueName: \"kubernetes.io/secret/c76dc364-84a8-4076-9206-56b74f0287b5-builder-dockercfg-rw7jx-pull\") pod \"c76dc364-84a8-4076-9206-56b74f0287b5\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") " Feb 21 00:19:21 crc kubenswrapper[4906]: I0221 00:19:21.672369 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-rw7jx-push\" (UniqueName: \"kubernetes.io/secret/c76dc364-84a8-4076-9206-56b74f0287b5-builder-dockercfg-rw7jx-push\") pod \"c76dc364-84a8-4076-9206-56b74f0287b5\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") " Feb 21 00:19:21 crc kubenswrapper[4906]: I0221 00:19:21.672393 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbvpg\" (UniqueName: \"kubernetes.io/projected/c76dc364-84a8-4076-9206-56b74f0287b5-kube-api-access-cbvpg\") pod \"c76dc364-84a8-4076-9206-56b74f0287b5\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") " Feb 21 00:19:21 crc kubenswrapper[4906]: I0221 00:19:21.672425 4906 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c76dc364-84a8-4076-9206-56b74f0287b5-build-ca-bundles\") pod \"c76dc364-84a8-4076-9206-56b74f0287b5\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") " Feb 21 00:19:21 crc kubenswrapper[4906]: I0221 00:19:21.672425 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c76dc364-84a8-4076-9206-56b74f0287b5-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "c76dc364-84a8-4076-9206-56b74f0287b5" (UID: "c76dc364-84a8-4076-9206-56b74f0287b5"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 00:19:21 crc kubenswrapper[4906]: I0221 00:19:21.672449 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/c76dc364-84a8-4076-9206-56b74f0287b5-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"c76dc364-84a8-4076-9206-56b74f0287b5\" (UID: \"c76dc364-84a8-4076-9206-56b74f0287b5\") " Feb 21 00:19:21 crc kubenswrapper[4906]: I0221 00:19:21.673298 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c76dc364-84a8-4076-9206-56b74f0287b5-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "c76dc364-84a8-4076-9206-56b74f0287b5" (UID: "c76dc364-84a8-4076-9206-56b74f0287b5"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:19:21 crc kubenswrapper[4906]: I0221 00:19:21.673643 4906 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c76dc364-84a8-4076-9206-56b74f0287b5-container-storage-root\") on node \"crc\" DevicePath \"\"" Feb 21 00:19:21 crc kubenswrapper[4906]: I0221 00:19:21.673681 4906 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c76dc364-84a8-4076-9206-56b74f0287b5-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 21 00:19:21 crc kubenswrapper[4906]: I0221 00:19:21.673732 4906 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c76dc364-84a8-4076-9206-56b74f0287b5-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Feb 21 00:19:21 crc kubenswrapper[4906]: I0221 00:19:21.673751 4906 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c76dc364-84a8-4076-9206-56b74f0287b5-buildworkdir\") on node \"crc\" DevicePath \"\"" Feb 21 00:19:21 crc kubenswrapper[4906]: I0221 00:19:21.673773 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c76dc364-84a8-4076-9206-56b74f0287b5-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "c76dc364-84a8-4076-9206-56b74f0287b5" (UID: "c76dc364-84a8-4076-9206-56b74f0287b5"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:19:21 crc kubenswrapper[4906]: I0221 00:19:21.673777 4906 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c76dc364-84a8-4076-9206-56b74f0287b5-build-system-configs\") on node \"crc\" DevicePath \"\"" Feb 21 00:19:21 crc kubenswrapper[4906]: I0221 00:19:21.678579 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c76dc364-84a8-4076-9206-56b74f0287b5-service-telemetry-framework-index-dockercfg-user-build-volume" (OuterVolumeSpecName: "service-telemetry-framework-index-dockercfg-user-build-volume") pod "c76dc364-84a8-4076-9206-56b74f0287b5" (UID: "c76dc364-84a8-4076-9206-56b74f0287b5"). InnerVolumeSpecName "service-telemetry-framework-index-dockercfg-user-build-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 00:19:21 crc kubenswrapper[4906]: I0221 00:19:21.681096 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c76dc364-84a8-4076-9206-56b74f0287b5-builder-dockercfg-rw7jx-push" (OuterVolumeSpecName: "builder-dockercfg-rw7jx-push") pod "c76dc364-84a8-4076-9206-56b74f0287b5" (UID: "c76dc364-84a8-4076-9206-56b74f0287b5"). InnerVolumeSpecName "builder-dockercfg-rw7jx-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 00:19:21 crc kubenswrapper[4906]: I0221 00:19:21.684436 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c76dc364-84a8-4076-9206-56b74f0287b5-kube-api-access-cbvpg" (OuterVolumeSpecName: "kube-api-access-cbvpg") pod "c76dc364-84a8-4076-9206-56b74f0287b5" (UID: "c76dc364-84a8-4076-9206-56b74f0287b5"). InnerVolumeSpecName "kube-api-access-cbvpg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:19:21 crc kubenswrapper[4906]: I0221 00:19:21.684765 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c76dc364-84a8-4076-9206-56b74f0287b5-builder-dockercfg-rw7jx-pull" (OuterVolumeSpecName: "builder-dockercfg-rw7jx-pull") pod "c76dc364-84a8-4076-9206-56b74f0287b5" (UID: "c76dc364-84a8-4076-9206-56b74f0287b5"). InnerVolumeSpecName "builder-dockercfg-rw7jx-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 00:19:21 crc kubenswrapper[4906]: I0221 00:19:21.774768 4906 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-rw7jx-pull\" (UniqueName: \"kubernetes.io/secret/c76dc364-84a8-4076-9206-56b74f0287b5-builder-dockercfg-rw7jx-pull\") on node \"crc\" DevicePath \"\"" Feb 21 00:19:21 crc kubenswrapper[4906]: I0221 00:19:21.774806 4906 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-rw7jx-push\" (UniqueName: \"kubernetes.io/secret/c76dc364-84a8-4076-9206-56b74f0287b5-builder-dockercfg-rw7jx-push\") on node \"crc\" DevicePath \"\"" Feb 21 00:19:21 crc kubenswrapper[4906]: I0221 00:19:21.774816 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbvpg\" (UniqueName: \"kubernetes.io/projected/c76dc364-84a8-4076-9206-56b74f0287b5-kube-api-access-cbvpg\") on node \"crc\" DevicePath \"\"" Feb 21 00:19:21 crc kubenswrapper[4906]: I0221 00:19:21.774826 4906 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c76dc364-84a8-4076-9206-56b74f0287b5-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 21 00:19:21 crc kubenswrapper[4906]: I0221 00:19:21.774836 4906 reconciler_common.go:293] "Volume detached for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/c76dc364-84a8-4076-9206-56b74f0287b5-service-telemetry-framework-index-dockercfg-user-build-volume\") 
on node \"crc\" DevicePath \"\"" Feb 21 00:19:22 crc kubenswrapper[4906]: I0221 00:19:22.163760 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-framework-index-1-build_c76dc364-84a8-4076-9206-56b74f0287b5/git-clone/0.log" Feb 21 00:19:22 crc kubenswrapper[4906]: I0221 00:19:22.163843 4906 generic.go:334] "Generic (PLEG): container finished" podID="c76dc364-84a8-4076-9206-56b74f0287b5" containerID="1f5edee6462aeb127ef0b77c720e528246366a247749ce0d6909b21289db37dd" exitCode=1 Feb 21 00:19:22 crc kubenswrapper[4906]: I0221 00:19:22.163888 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"c76dc364-84a8-4076-9206-56b74f0287b5","Type":"ContainerDied","Data":"1f5edee6462aeb127ef0b77c720e528246366a247749ce0d6909b21289db37dd"} Feb 21 00:19:22 crc kubenswrapper[4906]: I0221 00:19:22.163929 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"c76dc364-84a8-4076-9206-56b74f0287b5","Type":"ContainerDied","Data":"6248919b550800d1237893e34f3af49ef742d7357bd9e9c9b09f2d0e3dd6f79b"} Feb 21 00:19:22 crc kubenswrapper[4906]: I0221 00:19:22.163963 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Feb 21 00:19:22 crc kubenswrapper[4906]: I0221 00:19:22.163976 4906 scope.go:117] "RemoveContainer" containerID="1f5edee6462aeb127ef0b77c720e528246366a247749ce0d6909b21289db37dd" Feb 21 00:19:22 crc kubenswrapper[4906]: I0221 00:19:22.195600 4906 scope.go:117] "RemoveContainer" containerID="1f5edee6462aeb127ef0b77c720e528246366a247749ce0d6909b21289db37dd" Feb 21 00:19:22 crc kubenswrapper[4906]: E0221 00:19:22.196195 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f5edee6462aeb127ef0b77c720e528246366a247749ce0d6909b21289db37dd\": container with ID starting with 1f5edee6462aeb127ef0b77c720e528246366a247749ce0d6909b21289db37dd not found: ID does not exist" containerID="1f5edee6462aeb127ef0b77c720e528246366a247749ce0d6909b21289db37dd" Feb 21 00:19:22 crc kubenswrapper[4906]: I0221 00:19:22.196239 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f5edee6462aeb127ef0b77c720e528246366a247749ce0d6909b21289db37dd"} err="failed to get container status \"1f5edee6462aeb127ef0b77c720e528246366a247749ce0d6909b21289db37dd\": rpc error: code = NotFound desc = could not find container \"1f5edee6462aeb127ef0b77c720e528246366a247749ce0d6909b21289db37dd\": container with ID starting with 1f5edee6462aeb127ef0b77c720e528246366a247749ce0d6909b21289db37dd not found: ID does not exist" Feb 21 00:19:22 crc kubenswrapper[4906]: I0221 00:19:22.212631 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"] Feb 21 00:19:22 crc kubenswrapper[4906]: I0221 00:19:22.225794 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"] Feb 21 00:19:23 crc kubenswrapper[4906]: I0221 00:19:23.530082 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="c76dc364-84a8-4076-9206-56b74f0287b5" path="/var/lib/kubelet/pods/c76dc364-84a8-4076-9206-56b74f0287b5/volumes" Feb 21 00:19:31 crc kubenswrapper[4906]: I0221 00:19:31.698778 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-framework-index-2-build"] Feb 21 00:19:31 crc kubenswrapper[4906]: E0221 00:19:31.699726 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c76dc364-84a8-4076-9206-56b74f0287b5" containerName="git-clone" Feb 21 00:19:31 crc kubenswrapper[4906]: I0221 00:19:31.699745 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="c76dc364-84a8-4076-9206-56b74f0287b5" containerName="git-clone" Feb 21 00:19:31 crc kubenswrapper[4906]: I0221 00:19:31.699862 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="c76dc364-84a8-4076-9206-56b74f0287b5" containerName="git-clone" Feb 21 00:19:31 crc kubenswrapper[4906]: I0221 00:19:31.700599 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-2-build" Feb 21 00:19:31 crc kubenswrapper[4906]: I0221 00:19:31.703566 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-2-ca" Feb 21 00:19:31 crc kubenswrapper[4906]: I0221 00:19:31.703791 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"service-telemetry-framework-index-dockercfg" Feb 21 00:19:31 crc kubenswrapper[4906]: I0221 00:19:31.704069 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-2-sys-config" Feb 21 00:19:31 crc kubenswrapper[4906]: I0221 00:19:31.704229 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-rw7jx" Feb 21 00:19:31 crc kubenswrapper[4906]: I0221 00:19:31.707170 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-2-global-ca" Feb 21 00:19:31 crc kubenswrapper[4906]: I0221 00:19:31.712457 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/02bf5bed-1456-44fb-ba97-8e81a25a184e-buildcachedir\") pod \"service-telemetry-framework-index-2-build\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Feb 21 00:19:31 crc kubenswrapper[4906]: I0221 00:19:31.712530 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/02bf5bed-1456-44fb-ba97-8e81a25a184e-build-ca-bundles\") pod \"service-telemetry-framework-index-2-build\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Feb 21 00:19:31 crc kubenswrapper[4906]: I0221 
00:19:31.712566 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-rw7jx-pull\" (UniqueName: \"kubernetes.io/secret/02bf5bed-1456-44fb-ba97-8e81a25a184e-builder-dockercfg-rw7jx-pull\") pod \"service-telemetry-framework-index-2-build\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Feb 21 00:19:31 crc kubenswrapper[4906]: I0221 00:19:31.712636 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-rw7jx-push\" (UniqueName: \"kubernetes.io/secret/02bf5bed-1456-44fb-ba97-8e81a25a184e-builder-dockercfg-rw7jx-push\") pod \"service-telemetry-framework-index-2-build\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Feb 21 00:19:31 crc kubenswrapper[4906]: I0221 00:19:31.712660 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/02bf5bed-1456-44fb-ba97-8e81a25a184e-node-pullsecrets\") pod \"service-telemetry-framework-index-2-build\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Feb 21 00:19:31 crc kubenswrapper[4906]: I0221 00:19:31.712702 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/02bf5bed-1456-44fb-ba97-8e81a25a184e-build-blob-cache\") pod \"service-telemetry-framework-index-2-build\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Feb 21 00:19:31 crc kubenswrapper[4906]: I0221 00:19:31.712729 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: 
\"kubernetes.io/empty-dir/02bf5bed-1456-44fb-ba97-8e81a25a184e-container-storage-run\") pod \"service-telemetry-framework-index-2-build\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Feb 21 00:19:31 crc kubenswrapper[4906]: I0221 00:19:31.712761 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/02bf5bed-1456-44fb-ba97-8e81a25a184e-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-2-build\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Feb 21 00:19:31 crc kubenswrapper[4906]: I0221 00:19:31.712788 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/02bf5bed-1456-44fb-ba97-8e81a25a184e-container-storage-root\") pod \"service-telemetry-framework-index-2-build\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Feb 21 00:19:31 crc kubenswrapper[4906]: I0221 00:19:31.712814 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/02bf5bed-1456-44fb-ba97-8e81a25a184e-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-2-build\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Feb 21 00:19:31 crc kubenswrapper[4906]: I0221 00:19:31.712869 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/02bf5bed-1456-44fb-ba97-8e81a25a184e-build-system-configs\") pod 
\"service-telemetry-framework-index-2-build\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Feb 21 00:19:31 crc kubenswrapper[4906]: I0221 00:19:31.712900 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/02bf5bed-1456-44fb-ba97-8e81a25a184e-buildworkdir\") pod \"service-telemetry-framework-index-2-build\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Feb 21 00:19:31 crc kubenswrapper[4906]: I0221 00:19:31.712922 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wzlm\" (UniqueName: \"kubernetes.io/projected/02bf5bed-1456-44fb-ba97-8e81a25a184e-kube-api-access-4wzlm\") pod \"service-telemetry-framework-index-2-build\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Feb 21 00:19:31 crc kubenswrapper[4906]: I0221 00:19:31.730679 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-index-2-build"] Feb 21 00:19:31 crc kubenswrapper[4906]: I0221 00:19:31.813643 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/02bf5bed-1456-44fb-ba97-8e81a25a184e-build-system-configs\") pod \"service-telemetry-framework-index-2-build\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Feb 21 00:19:31 crc kubenswrapper[4906]: I0221 00:19:31.813713 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/02bf5bed-1456-44fb-ba97-8e81a25a184e-buildworkdir\") pod \"service-telemetry-framework-index-2-build\" (UID: 
\"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Feb 21 00:19:31 crc kubenswrapper[4906]: I0221 00:19:31.813742 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wzlm\" (UniqueName: \"kubernetes.io/projected/02bf5bed-1456-44fb-ba97-8e81a25a184e-kube-api-access-4wzlm\") pod \"service-telemetry-framework-index-2-build\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Feb 21 00:19:31 crc kubenswrapper[4906]: I0221 00:19:31.813771 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/02bf5bed-1456-44fb-ba97-8e81a25a184e-buildcachedir\") pod \"service-telemetry-framework-index-2-build\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Feb 21 00:19:31 crc kubenswrapper[4906]: I0221 00:19:31.813804 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/02bf5bed-1456-44fb-ba97-8e81a25a184e-build-ca-bundles\") pod \"service-telemetry-framework-index-2-build\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Feb 21 00:19:31 crc kubenswrapper[4906]: I0221 00:19:31.813829 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-rw7jx-pull\" (UniqueName: \"kubernetes.io/secret/02bf5bed-1456-44fb-ba97-8e81a25a184e-builder-dockercfg-rw7jx-pull\") pod \"service-telemetry-framework-index-2-build\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Feb 21 00:19:31 crc kubenswrapper[4906]: I0221 00:19:31.813871 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"builder-dockercfg-rw7jx-push\" (UniqueName: \"kubernetes.io/secret/02bf5bed-1456-44fb-ba97-8e81a25a184e-builder-dockercfg-rw7jx-push\") pod \"service-telemetry-framework-index-2-build\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Feb 21 00:19:31 crc kubenswrapper[4906]: I0221 00:19:31.813893 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/02bf5bed-1456-44fb-ba97-8e81a25a184e-node-pullsecrets\") pod \"service-telemetry-framework-index-2-build\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Feb 21 00:19:31 crc kubenswrapper[4906]: I0221 00:19:31.813916 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/02bf5bed-1456-44fb-ba97-8e81a25a184e-build-blob-cache\") pod \"service-telemetry-framework-index-2-build\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Feb 21 00:19:31 crc kubenswrapper[4906]: I0221 00:19:31.813938 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/02bf5bed-1456-44fb-ba97-8e81a25a184e-container-storage-run\") pod \"service-telemetry-framework-index-2-build\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Feb 21 00:19:31 crc kubenswrapper[4906]: I0221 00:19:31.813946 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/02bf5bed-1456-44fb-ba97-8e81a25a184e-buildcachedir\") pod \"service-telemetry-framework-index-2-build\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " 
pod="service-telemetry/service-telemetry-framework-index-2-build" Feb 21 00:19:31 crc kubenswrapper[4906]: I0221 00:19:31.813971 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/02bf5bed-1456-44fb-ba97-8e81a25a184e-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-2-build\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Feb 21 00:19:31 crc kubenswrapper[4906]: I0221 00:19:31.813999 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/02bf5bed-1456-44fb-ba97-8e81a25a184e-container-storage-root\") pod \"service-telemetry-framework-index-2-build\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Feb 21 00:19:31 crc kubenswrapper[4906]: I0221 00:19:31.814023 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/02bf5bed-1456-44fb-ba97-8e81a25a184e-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-2-build\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Feb 21 00:19:31 crc kubenswrapper[4906]: I0221 00:19:31.814047 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/02bf5bed-1456-44fb-ba97-8e81a25a184e-node-pullsecrets\") pod \"service-telemetry-framework-index-2-build\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Feb 21 00:19:31 crc kubenswrapper[4906]: I0221 00:19:31.814370 4906 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/02bf5bed-1456-44fb-ba97-8e81a25a184e-build-system-configs\") pod \"service-telemetry-framework-index-2-build\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Feb 21 00:19:31 crc kubenswrapper[4906]: I0221 00:19:31.814621 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/02bf5bed-1456-44fb-ba97-8e81a25a184e-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-2-build\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Feb 21 00:19:31 crc kubenswrapper[4906]: I0221 00:19:31.814845 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/02bf5bed-1456-44fb-ba97-8e81a25a184e-container-storage-run\") pod \"service-telemetry-framework-index-2-build\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Feb 21 00:19:31 crc kubenswrapper[4906]: I0221 00:19:31.814965 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/02bf5bed-1456-44fb-ba97-8e81a25a184e-buildworkdir\") pod \"service-telemetry-framework-index-2-build\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Feb 21 00:19:31 crc kubenswrapper[4906]: I0221 00:19:31.815089 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/02bf5bed-1456-44fb-ba97-8e81a25a184e-build-ca-bundles\") pod \"service-telemetry-framework-index-2-build\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " 
pod="service-telemetry/service-telemetry-framework-index-2-build" Feb 21 00:19:31 crc kubenswrapper[4906]: I0221 00:19:31.815098 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/02bf5bed-1456-44fb-ba97-8e81a25a184e-build-blob-cache\") pod \"service-telemetry-framework-index-2-build\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Feb 21 00:19:31 crc kubenswrapper[4906]: I0221 00:19:31.815314 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/02bf5bed-1456-44fb-ba97-8e81a25a184e-container-storage-root\") pod \"service-telemetry-framework-index-2-build\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Feb 21 00:19:31 crc kubenswrapper[4906]: I0221 00:19:31.820645 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-rw7jx-push\" (UniqueName: \"kubernetes.io/secret/02bf5bed-1456-44fb-ba97-8e81a25a184e-builder-dockercfg-rw7jx-push\") pod \"service-telemetry-framework-index-2-build\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Feb 21 00:19:31 crc kubenswrapper[4906]: I0221 00:19:31.820763 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/02bf5bed-1456-44fb-ba97-8e81a25a184e-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-2-build\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Feb 21 00:19:31 crc kubenswrapper[4906]: I0221 00:19:31.821168 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"builder-dockercfg-rw7jx-pull\" (UniqueName: \"kubernetes.io/secret/02bf5bed-1456-44fb-ba97-8e81a25a184e-builder-dockercfg-rw7jx-pull\") pod \"service-telemetry-framework-index-2-build\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Feb 21 00:19:31 crc kubenswrapper[4906]: I0221 00:19:31.843088 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wzlm\" (UniqueName: \"kubernetes.io/projected/02bf5bed-1456-44fb-ba97-8e81a25a184e-kube-api-access-4wzlm\") pod \"service-telemetry-framework-index-2-build\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Feb 21 00:19:32 crc kubenswrapper[4906]: I0221 00:19:32.030103 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-2-build" Feb 21 00:19:32 crc kubenswrapper[4906]: I0221 00:19:32.292709 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-index-2-build"] Feb 21 00:19:33 crc kubenswrapper[4906]: I0221 00:19:33.255578 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-2-build" event={"ID":"02bf5bed-1456-44fb-ba97-8e81a25a184e","Type":"ContainerStarted","Data":"473fd35f3a181f0188de5757a71faf0c5b65dbdfd7ff02168844454b0bcafd72"} Feb 21 00:19:33 crc kubenswrapper[4906]: I0221 00:19:33.255936 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-2-build" event={"ID":"02bf5bed-1456-44fb-ba97-8e81a25a184e","Type":"ContainerStarted","Data":"c1639a76207c18fb0012623903821793b8116bd4062b4dd8616db87a0fc92ee2"} Feb 21 00:19:33 crc kubenswrapper[4906]: E0221 00:19:33.312532 4906 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3717330381476959516, SKID=, 
AKID=F9:DE:2F:E9:32:18:1D:91:9B:B9:E7:95:44:B2:F0:DC:64:1E:3D:CB failed: x509: certificate signed by unknown authority" Feb 21 00:19:34 crc kubenswrapper[4906]: I0221 00:19:34.341209 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-framework-index-2-build"] Feb 21 00:19:35 crc kubenswrapper[4906]: I0221 00:19:35.273634 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-framework-index-2-build" podUID="02bf5bed-1456-44fb-ba97-8e81a25a184e" containerName="git-clone" containerID="cri-o://473fd35f3a181f0188de5757a71faf0c5b65dbdfd7ff02168844454b0bcafd72" gracePeriod=30 Feb 21 00:19:35 crc kubenswrapper[4906]: I0221 00:19:35.666816 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-framework-index-2-build_02bf5bed-1456-44fb-ba97-8e81a25a184e/git-clone/0.log" Feb 21 00:19:35 crc kubenswrapper[4906]: I0221 00:19:35.666928 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-2-build" Feb 21 00:19:35 crc kubenswrapper[4906]: I0221 00:19:35.670830 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wzlm\" (UniqueName: \"kubernetes.io/projected/02bf5bed-1456-44fb-ba97-8e81a25a184e-kube-api-access-4wzlm\") pod \"02bf5bed-1456-44fb-ba97-8e81a25a184e\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " Feb 21 00:19:35 crc kubenswrapper[4906]: I0221 00:19:35.670933 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/02bf5bed-1456-44fb-ba97-8e81a25a184e-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"02bf5bed-1456-44fb-ba97-8e81a25a184e\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " Feb 21 00:19:35 crc kubenswrapper[4906]: I0221 00:19:35.670990 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/02bf5bed-1456-44fb-ba97-8e81a25a184e-buildcachedir\") pod \"02bf5bed-1456-44fb-ba97-8e81a25a184e\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " Feb 21 00:19:35 crc kubenswrapper[4906]: I0221 00:19:35.671044 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/02bf5bed-1456-44fb-ba97-8e81a25a184e-container-storage-run\") pod \"02bf5bed-1456-44fb-ba97-8e81a25a184e\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " Feb 21 00:19:35 crc kubenswrapper[4906]: I0221 00:19:35.671092 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/02bf5bed-1456-44fb-ba97-8e81a25a184e-build-system-configs\") pod \"02bf5bed-1456-44fb-ba97-8e81a25a184e\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " Feb 21 
00:19:35 crc kubenswrapper[4906]: I0221 00:19:35.671132 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/02bf5bed-1456-44fb-ba97-8e81a25a184e-container-storage-root\") pod \"02bf5bed-1456-44fb-ba97-8e81a25a184e\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " Feb 21 00:19:35 crc kubenswrapper[4906]: I0221 00:19:35.671181 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-rw7jx-pull\" (UniqueName: \"kubernetes.io/secret/02bf5bed-1456-44fb-ba97-8e81a25a184e-builder-dockercfg-rw7jx-pull\") pod \"02bf5bed-1456-44fb-ba97-8e81a25a184e\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " Feb 21 00:19:35 crc kubenswrapper[4906]: I0221 00:19:35.671272 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-rw7jx-push\" (UniqueName: \"kubernetes.io/secret/02bf5bed-1456-44fb-ba97-8e81a25a184e-builder-dockercfg-rw7jx-push\") pod \"02bf5bed-1456-44fb-ba97-8e81a25a184e\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " Feb 21 00:19:35 crc kubenswrapper[4906]: I0221 00:19:35.671310 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/02bf5bed-1456-44fb-ba97-8e81a25a184e-node-pullsecrets\") pod \"02bf5bed-1456-44fb-ba97-8e81a25a184e\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " Feb 21 00:19:35 crc kubenswrapper[4906]: I0221 00:19:35.671341 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/02bf5bed-1456-44fb-ba97-8e81a25a184e-build-blob-cache\") pod \"02bf5bed-1456-44fb-ba97-8e81a25a184e\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " Feb 21 00:19:35 crc kubenswrapper[4906]: I0221 00:19:35.671393 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" 
(UniqueName: \"kubernetes.io/empty-dir/02bf5bed-1456-44fb-ba97-8e81a25a184e-buildworkdir\") pod \"02bf5bed-1456-44fb-ba97-8e81a25a184e\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " Feb 21 00:19:35 crc kubenswrapper[4906]: I0221 00:19:35.671446 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/02bf5bed-1456-44fb-ba97-8e81a25a184e-build-proxy-ca-bundles\") pod \"02bf5bed-1456-44fb-ba97-8e81a25a184e\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " Feb 21 00:19:35 crc kubenswrapper[4906]: I0221 00:19:35.671493 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/02bf5bed-1456-44fb-ba97-8e81a25a184e-build-ca-bundles\") pod \"02bf5bed-1456-44fb-ba97-8e81a25a184e\" (UID: \"02bf5bed-1456-44fb-ba97-8e81a25a184e\") " Feb 21 00:19:35 crc kubenswrapper[4906]: I0221 00:19:35.671676 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02bf5bed-1456-44fb-ba97-8e81a25a184e-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "02bf5bed-1456-44fb-ba97-8e81a25a184e" (UID: "02bf5bed-1456-44fb-ba97-8e81a25a184e"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:19:35 crc kubenswrapper[4906]: I0221 00:19:35.671752 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02bf5bed-1456-44fb-ba97-8e81a25a184e-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "02bf5bed-1456-44fb-ba97-8e81a25a184e" (UID: "02bf5bed-1456-44fb-ba97-8e81a25a184e"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 00:19:35 crc kubenswrapper[4906]: I0221 00:19:35.671782 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02bf5bed-1456-44fb-ba97-8e81a25a184e-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "02bf5bed-1456-44fb-ba97-8e81a25a184e" (UID: "02bf5bed-1456-44fb-ba97-8e81a25a184e"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 00:19:35 crc kubenswrapper[4906]: I0221 00:19:35.671818 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02bf5bed-1456-44fb-ba97-8e81a25a184e-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "02bf5bed-1456-44fb-ba97-8e81a25a184e" (UID: "02bf5bed-1456-44fb-ba97-8e81a25a184e"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:19:35 crc kubenswrapper[4906]: I0221 00:19:35.671935 4906 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/02bf5bed-1456-44fb-ba97-8e81a25a184e-buildcachedir\") on node \"crc\" DevicePath \"\"" Feb 21 00:19:35 crc kubenswrapper[4906]: I0221 00:19:35.671984 4906 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/02bf5bed-1456-44fb-ba97-8e81a25a184e-container-storage-run\") on node \"crc\" DevicePath \"\"" Feb 21 00:19:35 crc kubenswrapper[4906]: I0221 00:19:35.672032 4906 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/02bf5bed-1456-44fb-ba97-8e81a25a184e-build-system-configs\") on node \"crc\" DevicePath \"\"" Feb 21 00:19:35 crc kubenswrapper[4906]: I0221 00:19:35.672065 4906 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/02bf5bed-1456-44fb-ba97-8e81a25a184e-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Feb 21 00:19:35 crc kubenswrapper[4906]: I0221 00:19:35.671959 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02bf5bed-1456-44fb-ba97-8e81a25a184e-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "02bf5bed-1456-44fb-ba97-8e81a25a184e" (UID: "02bf5bed-1456-44fb-ba97-8e81a25a184e"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:19:35 crc kubenswrapper[4906]: I0221 00:19:35.672257 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02bf5bed-1456-44fb-ba97-8e81a25a184e-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "02bf5bed-1456-44fb-ba97-8e81a25a184e" (UID: "02bf5bed-1456-44fb-ba97-8e81a25a184e"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:19:35 crc kubenswrapper[4906]: I0221 00:19:35.673021 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02bf5bed-1456-44fb-ba97-8e81a25a184e-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "02bf5bed-1456-44fb-ba97-8e81a25a184e" (UID: "02bf5bed-1456-44fb-ba97-8e81a25a184e"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:19:35 crc kubenswrapper[4906]: I0221 00:19:35.673047 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02bf5bed-1456-44fb-ba97-8e81a25a184e-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "02bf5bed-1456-44fb-ba97-8e81a25a184e" (UID: "02bf5bed-1456-44fb-ba97-8e81a25a184e"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:19:35 crc kubenswrapper[4906]: I0221 00:19:35.673219 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02bf5bed-1456-44fb-ba97-8e81a25a184e-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "02bf5bed-1456-44fb-ba97-8e81a25a184e" (UID: "02bf5bed-1456-44fb-ba97-8e81a25a184e"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:19:35 crc kubenswrapper[4906]: I0221 00:19:35.677225 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02bf5bed-1456-44fb-ba97-8e81a25a184e-builder-dockercfg-rw7jx-pull" (OuterVolumeSpecName: "builder-dockercfg-rw7jx-pull") pod "02bf5bed-1456-44fb-ba97-8e81a25a184e" (UID: "02bf5bed-1456-44fb-ba97-8e81a25a184e"). InnerVolumeSpecName "builder-dockercfg-rw7jx-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 00:19:35 crc kubenswrapper[4906]: I0221 00:19:35.677269 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02bf5bed-1456-44fb-ba97-8e81a25a184e-kube-api-access-4wzlm" (OuterVolumeSpecName: "kube-api-access-4wzlm") pod "02bf5bed-1456-44fb-ba97-8e81a25a184e" (UID: "02bf5bed-1456-44fb-ba97-8e81a25a184e"). InnerVolumeSpecName "kube-api-access-4wzlm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:19:35 crc kubenswrapper[4906]: I0221 00:19:35.677264 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02bf5bed-1456-44fb-ba97-8e81a25a184e-service-telemetry-framework-index-dockercfg-user-build-volume" (OuterVolumeSpecName: "service-telemetry-framework-index-dockercfg-user-build-volume") pod "02bf5bed-1456-44fb-ba97-8e81a25a184e" (UID: "02bf5bed-1456-44fb-ba97-8e81a25a184e"). InnerVolumeSpecName "service-telemetry-framework-index-dockercfg-user-build-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 00:19:35 crc kubenswrapper[4906]: I0221 00:19:35.677972 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02bf5bed-1456-44fb-ba97-8e81a25a184e-builder-dockercfg-rw7jx-push" (OuterVolumeSpecName: "builder-dockercfg-rw7jx-push") pod "02bf5bed-1456-44fb-ba97-8e81a25a184e" (UID: "02bf5bed-1456-44fb-ba97-8e81a25a184e"). InnerVolumeSpecName "builder-dockercfg-rw7jx-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 00:19:35 crc kubenswrapper[4906]: I0221 00:19:35.773048 4906 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/02bf5bed-1456-44fb-ba97-8e81a25a184e-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 21 00:19:35 crc kubenswrapper[4906]: I0221 00:19:35.773227 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wzlm\" (UniqueName: \"kubernetes.io/projected/02bf5bed-1456-44fb-ba97-8e81a25a184e-kube-api-access-4wzlm\") on node \"crc\" DevicePath \"\"" Feb 21 00:19:35 crc kubenswrapper[4906]: I0221 00:19:35.773330 4906 reconciler_common.go:293] "Volume detached for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/02bf5bed-1456-44fb-ba97-8e81a25a184e-service-telemetry-framework-index-dockercfg-user-build-volume\") on node \"crc\" DevicePath \"\"" Feb 21 00:19:35 crc kubenswrapper[4906]: I0221 00:19:35.773404 4906 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/02bf5bed-1456-44fb-ba97-8e81a25a184e-container-storage-root\") on node \"crc\" DevicePath \"\"" Feb 21 00:19:35 crc kubenswrapper[4906]: I0221 00:19:35.773457 4906 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-rw7jx-push\" (UniqueName: \"kubernetes.io/secret/02bf5bed-1456-44fb-ba97-8e81a25a184e-builder-dockercfg-rw7jx-push\") on node 
\"crc\" DevicePath \"\"" Feb 21 00:19:35 crc kubenswrapper[4906]: I0221 00:19:35.773518 4906 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-rw7jx-pull\" (UniqueName: \"kubernetes.io/secret/02bf5bed-1456-44fb-ba97-8e81a25a184e-builder-dockercfg-rw7jx-pull\") on node \"crc\" DevicePath \"\"" Feb 21 00:19:35 crc kubenswrapper[4906]: I0221 00:19:35.773573 4906 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/02bf5bed-1456-44fb-ba97-8e81a25a184e-build-blob-cache\") on node \"crc\" DevicePath \"\"" Feb 21 00:19:35 crc kubenswrapper[4906]: I0221 00:19:35.773626 4906 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/02bf5bed-1456-44fb-ba97-8e81a25a184e-buildworkdir\") on node \"crc\" DevicePath \"\"" Feb 21 00:19:35 crc kubenswrapper[4906]: I0221 00:19:35.773676 4906 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/02bf5bed-1456-44fb-ba97-8e81a25a184e-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 21 00:19:36 crc kubenswrapper[4906]: I0221 00:19:36.284200 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-framework-index-2-build_02bf5bed-1456-44fb-ba97-8e81a25a184e/git-clone/0.log" Feb 21 00:19:36 crc kubenswrapper[4906]: I0221 00:19:36.284280 4906 generic.go:334] "Generic (PLEG): container finished" podID="02bf5bed-1456-44fb-ba97-8e81a25a184e" containerID="473fd35f3a181f0188de5757a71faf0c5b65dbdfd7ff02168844454b0bcafd72" exitCode=1 Feb 21 00:19:36 crc kubenswrapper[4906]: I0221 00:19:36.284328 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-2-build" event={"ID":"02bf5bed-1456-44fb-ba97-8e81a25a184e","Type":"ContainerDied","Data":"473fd35f3a181f0188de5757a71faf0c5b65dbdfd7ff02168844454b0bcafd72"} Feb 21 00:19:36 crc 
kubenswrapper[4906]: I0221 00:19:36.284369 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-2-build" event={"ID":"02bf5bed-1456-44fb-ba97-8e81a25a184e","Type":"ContainerDied","Data":"c1639a76207c18fb0012623903821793b8116bd4062b4dd8616db87a0fc92ee2"} Feb 21 00:19:36 crc kubenswrapper[4906]: I0221 00:19:36.284398 4906 scope.go:117] "RemoveContainer" containerID="473fd35f3a181f0188de5757a71faf0c5b65dbdfd7ff02168844454b0bcafd72" Feb 21 00:19:36 crc kubenswrapper[4906]: I0221 00:19:36.284590 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-2-build" Feb 21 00:19:36 crc kubenswrapper[4906]: I0221 00:19:36.313718 4906 scope.go:117] "RemoveContainer" containerID="473fd35f3a181f0188de5757a71faf0c5b65dbdfd7ff02168844454b0bcafd72" Feb 21 00:19:36 crc kubenswrapper[4906]: E0221 00:19:36.314272 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"473fd35f3a181f0188de5757a71faf0c5b65dbdfd7ff02168844454b0bcafd72\": container with ID starting with 473fd35f3a181f0188de5757a71faf0c5b65dbdfd7ff02168844454b0bcafd72 not found: ID does not exist" containerID="473fd35f3a181f0188de5757a71faf0c5b65dbdfd7ff02168844454b0bcafd72" Feb 21 00:19:36 crc kubenswrapper[4906]: I0221 00:19:36.314309 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"473fd35f3a181f0188de5757a71faf0c5b65dbdfd7ff02168844454b0bcafd72"} err="failed to get container status \"473fd35f3a181f0188de5757a71faf0c5b65dbdfd7ff02168844454b0bcafd72\": rpc error: code = NotFound desc = could not find container \"473fd35f3a181f0188de5757a71faf0c5b65dbdfd7ff02168844454b0bcafd72\": container with ID starting with 473fd35f3a181f0188de5757a71faf0c5b65dbdfd7ff02168844454b0bcafd72 not found: ID does not exist" Feb 21 00:19:36 crc kubenswrapper[4906]: I0221 
00:19:36.343448 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-framework-index-2-build"] Feb 21 00:19:36 crc kubenswrapper[4906]: I0221 00:19:36.357884 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-framework-index-2-build"] Feb 21 00:19:37 crc kubenswrapper[4906]: I0221 00:19:37.526589 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02bf5bed-1456-44fb-ba97-8e81a25a184e" path="/var/lib/kubelet/pods/02bf5bed-1456-44fb-ba97-8e81a25a184e/volumes" Feb 21 00:19:43 crc kubenswrapper[4906]: I0221 00:19:43.123617 4906 patch_prober.go:28] interesting pod/machine-config-daemon-b9qdv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 00:19:43 crc kubenswrapper[4906]: I0221 00:19:43.124047 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 00:19:45 crc kubenswrapper[4906]: I0221 00:19:45.779845 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-framework-index-3-build"] Feb 21 00:19:45 crc kubenswrapper[4906]: E0221 00:19:45.780348 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02bf5bed-1456-44fb-ba97-8e81a25a184e" containerName="git-clone" Feb 21 00:19:45 crc kubenswrapper[4906]: I0221 00:19:45.780361 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="02bf5bed-1456-44fb-ba97-8e81a25a184e" containerName="git-clone" Feb 21 00:19:45 crc kubenswrapper[4906]: I0221 00:19:45.780456 4906 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="02bf5bed-1456-44fb-ba97-8e81a25a184e" containerName="git-clone" Feb 21 00:19:45 crc kubenswrapper[4906]: I0221 00:19:45.781267 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-3-build" Feb 21 00:19:45 crc kubenswrapper[4906]: I0221 00:19:45.783393 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"service-telemetry-framework-index-dockercfg" Feb 21 00:19:45 crc kubenswrapper[4906]: I0221 00:19:45.783490 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-3-sys-config" Feb 21 00:19:45 crc kubenswrapper[4906]: I0221 00:19:45.783495 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-3-ca" Feb 21 00:19:45 crc kubenswrapper[4906]: I0221 00:19:45.783487 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-rw7jx" Feb 21 00:19:45 crc kubenswrapper[4906]: I0221 00:19:45.787038 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-3-global-ca" Feb 21 00:19:45 crc kubenswrapper[4906]: I0221 00:19:45.815361 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-index-3-build"] Feb 21 00:19:45 crc kubenswrapper[4906]: I0221 00:19:45.852417 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/c41a7100-4d12-4872-8738-2849a6efeba8-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-3-build\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " pod="service-telemetry/service-telemetry-framework-index-3-build" Feb 21 00:19:45 crc 
kubenswrapper[4906]: I0221 00:19:45.852476 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c41a7100-4d12-4872-8738-2849a6efeba8-build-blob-cache\") pod \"service-telemetry-framework-index-3-build\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " pod="service-telemetry/service-telemetry-framework-index-3-build" Feb 21 00:19:45 crc kubenswrapper[4906]: I0221 00:19:45.852500 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c41a7100-4d12-4872-8738-2849a6efeba8-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-3-build\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " pod="service-telemetry/service-telemetry-framework-index-3-build" Feb 21 00:19:45 crc kubenswrapper[4906]: I0221 00:19:45.852625 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c41a7100-4d12-4872-8738-2849a6efeba8-build-system-configs\") pod \"service-telemetry-framework-index-3-build\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " pod="service-telemetry/service-telemetry-framework-index-3-build" Feb 21 00:19:45 crc kubenswrapper[4906]: I0221 00:19:45.852666 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c41a7100-4d12-4872-8738-2849a6efeba8-buildcachedir\") pod \"service-telemetry-framework-index-3-build\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " pod="service-telemetry/service-telemetry-framework-index-3-build" Feb 21 00:19:45 crc kubenswrapper[4906]: I0221 00:19:45.852770 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: 
\"kubernetes.io/empty-dir/c41a7100-4d12-4872-8738-2849a6efeba8-container-storage-run\") pod \"service-telemetry-framework-index-3-build\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " pod="service-telemetry/service-telemetry-framework-index-3-build" Feb 21 00:19:45 crc kubenswrapper[4906]: I0221 00:19:45.852836 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-rw7jx-push\" (UniqueName: \"kubernetes.io/secret/c41a7100-4d12-4872-8738-2849a6efeba8-builder-dockercfg-rw7jx-push\") pod \"service-telemetry-framework-index-3-build\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " pod="service-telemetry/service-telemetry-framework-index-3-build" Feb 21 00:19:45 crc kubenswrapper[4906]: I0221 00:19:45.852860 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c41a7100-4d12-4872-8738-2849a6efeba8-container-storage-root\") pod \"service-telemetry-framework-index-3-build\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " pod="service-telemetry/service-telemetry-framework-index-3-build" Feb 21 00:19:45 crc kubenswrapper[4906]: I0221 00:19:45.852902 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c41a7100-4d12-4872-8738-2849a6efeba8-build-ca-bundles\") pod \"service-telemetry-framework-index-3-build\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " pod="service-telemetry/service-telemetry-framework-index-3-build" Feb 21 00:19:45 crc kubenswrapper[4906]: I0221 00:19:45.852921 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c41a7100-4d12-4872-8738-2849a6efeba8-buildworkdir\") pod \"service-telemetry-framework-index-3-build\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " 
pod="service-telemetry/service-telemetry-framework-index-3-build" Feb 21 00:19:45 crc kubenswrapper[4906]: I0221 00:19:45.852958 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-rw7jx-pull\" (UniqueName: \"kubernetes.io/secret/c41a7100-4d12-4872-8738-2849a6efeba8-builder-dockercfg-rw7jx-pull\") pod \"service-telemetry-framework-index-3-build\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " pod="service-telemetry/service-telemetry-framework-index-3-build" Feb 21 00:19:45 crc kubenswrapper[4906]: I0221 00:19:45.852975 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j79s\" (UniqueName: \"kubernetes.io/projected/c41a7100-4d12-4872-8738-2849a6efeba8-kube-api-access-7j79s\") pod \"service-telemetry-framework-index-3-build\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " pod="service-telemetry/service-telemetry-framework-index-3-build" Feb 21 00:19:45 crc kubenswrapper[4906]: I0221 00:19:45.853111 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c41a7100-4d12-4872-8738-2849a6efeba8-node-pullsecrets\") pod \"service-telemetry-framework-index-3-build\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " pod="service-telemetry/service-telemetry-framework-index-3-build" Feb 21 00:19:45 crc kubenswrapper[4906]: I0221 00:19:45.954433 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c41a7100-4d12-4872-8738-2849a6efeba8-build-ca-bundles\") pod \"service-telemetry-framework-index-3-build\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " pod="service-telemetry/service-telemetry-framework-index-3-build" Feb 21 00:19:45 crc kubenswrapper[4906]: I0221 00:19:45.954491 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c41a7100-4d12-4872-8738-2849a6efeba8-buildworkdir\") pod \"service-telemetry-framework-index-3-build\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " pod="service-telemetry/service-telemetry-framework-index-3-build" Feb 21 00:19:45 crc kubenswrapper[4906]: I0221 00:19:45.954523 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-rw7jx-pull\" (UniqueName: \"kubernetes.io/secret/c41a7100-4d12-4872-8738-2849a6efeba8-builder-dockercfg-rw7jx-pull\") pod \"service-telemetry-framework-index-3-build\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " pod="service-telemetry/service-telemetry-framework-index-3-build" Feb 21 00:19:45 crc kubenswrapper[4906]: I0221 00:19:45.954541 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j79s\" (UniqueName: \"kubernetes.io/projected/c41a7100-4d12-4872-8738-2849a6efeba8-kube-api-access-7j79s\") pod \"service-telemetry-framework-index-3-build\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " pod="service-telemetry/service-telemetry-framework-index-3-build" Feb 21 00:19:45 crc kubenswrapper[4906]: I0221 00:19:45.954565 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c41a7100-4d12-4872-8738-2849a6efeba8-node-pullsecrets\") pod \"service-telemetry-framework-index-3-build\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " pod="service-telemetry/service-telemetry-framework-index-3-build" Feb 21 00:19:45 crc kubenswrapper[4906]: I0221 00:19:45.954596 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/c41a7100-4d12-4872-8738-2849a6efeba8-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-3-build\" (UID: 
\"c41a7100-4d12-4872-8738-2849a6efeba8\") " pod="service-telemetry/service-telemetry-framework-index-3-build" Feb 21 00:19:45 crc kubenswrapper[4906]: I0221 00:19:45.954626 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c41a7100-4d12-4872-8738-2849a6efeba8-build-blob-cache\") pod \"service-telemetry-framework-index-3-build\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " pod="service-telemetry/service-telemetry-framework-index-3-build" Feb 21 00:19:45 crc kubenswrapper[4906]: I0221 00:19:45.954646 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c41a7100-4d12-4872-8738-2849a6efeba8-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-3-build\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " pod="service-telemetry/service-telemetry-framework-index-3-build" Feb 21 00:19:45 crc kubenswrapper[4906]: I0221 00:19:45.954673 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c41a7100-4d12-4872-8738-2849a6efeba8-build-system-configs\") pod \"service-telemetry-framework-index-3-build\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " pod="service-telemetry/service-telemetry-framework-index-3-build" Feb 21 00:19:45 crc kubenswrapper[4906]: I0221 00:19:45.954693 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c41a7100-4d12-4872-8738-2849a6efeba8-buildcachedir\") pod \"service-telemetry-framework-index-3-build\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " pod="service-telemetry/service-telemetry-framework-index-3-build" Feb 21 00:19:45 crc kubenswrapper[4906]: I0221 00:19:45.954730 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c41a7100-4d12-4872-8738-2849a6efeba8-container-storage-run\") pod \"service-telemetry-framework-index-3-build\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " pod="service-telemetry/service-telemetry-framework-index-3-build" Feb 21 00:19:45 crc kubenswrapper[4906]: I0221 00:19:45.954750 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-rw7jx-push\" (UniqueName: \"kubernetes.io/secret/c41a7100-4d12-4872-8738-2849a6efeba8-builder-dockercfg-rw7jx-push\") pod \"service-telemetry-framework-index-3-build\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " pod="service-telemetry/service-telemetry-framework-index-3-build" Feb 21 00:19:45 crc kubenswrapper[4906]: I0221 00:19:45.954769 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c41a7100-4d12-4872-8738-2849a6efeba8-container-storage-root\") pod \"service-telemetry-framework-index-3-build\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " pod="service-telemetry/service-telemetry-framework-index-3-build" Feb 21 00:19:45 crc kubenswrapper[4906]: I0221 00:19:45.954946 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c41a7100-4d12-4872-8738-2849a6efeba8-buildcachedir\") pod \"service-telemetry-framework-index-3-build\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " pod="service-telemetry/service-telemetry-framework-index-3-build" Feb 21 00:19:45 crc kubenswrapper[4906]: I0221 00:19:45.955182 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c41a7100-4d12-4872-8738-2849a6efeba8-node-pullsecrets\") pod \"service-telemetry-framework-index-3-build\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " 
pod="service-telemetry/service-telemetry-framework-index-3-build" Feb 21 00:19:45 crc kubenswrapper[4906]: I0221 00:19:45.956189 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c41a7100-4d12-4872-8738-2849a6efeba8-build-ca-bundles\") pod \"service-telemetry-framework-index-3-build\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " pod="service-telemetry/service-telemetry-framework-index-3-build" Feb 21 00:19:45 crc kubenswrapper[4906]: I0221 00:19:45.956550 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c41a7100-4d12-4872-8738-2849a6efeba8-container-storage-root\") pod \"service-telemetry-framework-index-3-build\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " pod="service-telemetry/service-telemetry-framework-index-3-build" Feb 21 00:19:45 crc kubenswrapper[4906]: I0221 00:19:45.956629 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c41a7100-4d12-4872-8738-2849a6efeba8-build-blob-cache\") pod \"service-telemetry-framework-index-3-build\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " pod="service-telemetry/service-telemetry-framework-index-3-build" Feb 21 00:19:45 crc kubenswrapper[4906]: I0221 00:19:45.956838 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c41a7100-4d12-4872-8738-2849a6efeba8-container-storage-run\") pod \"service-telemetry-framework-index-3-build\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " pod="service-telemetry/service-telemetry-framework-index-3-build" Feb 21 00:19:45 crc kubenswrapper[4906]: I0221 00:19:45.956933 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: 
\"kubernetes.io/empty-dir/c41a7100-4d12-4872-8738-2849a6efeba8-buildworkdir\") pod \"service-telemetry-framework-index-3-build\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " pod="service-telemetry/service-telemetry-framework-index-3-build" Feb 21 00:19:45 crc kubenswrapper[4906]: I0221 00:19:45.957434 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c41a7100-4d12-4872-8738-2849a6efeba8-build-system-configs\") pod \"service-telemetry-framework-index-3-build\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " pod="service-telemetry/service-telemetry-framework-index-3-build" Feb 21 00:19:45 crc kubenswrapper[4906]: I0221 00:19:45.957550 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c41a7100-4d12-4872-8738-2849a6efeba8-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-3-build\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " pod="service-telemetry/service-telemetry-framework-index-3-build" Feb 21 00:19:45 crc kubenswrapper[4906]: I0221 00:19:45.960394 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-rw7jx-pull\" (UniqueName: \"kubernetes.io/secret/c41a7100-4d12-4872-8738-2849a6efeba8-builder-dockercfg-rw7jx-pull\") pod \"service-telemetry-framework-index-3-build\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " pod="service-telemetry/service-telemetry-framework-index-3-build" Feb 21 00:19:45 crc kubenswrapper[4906]: I0221 00:19:45.967582 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-rw7jx-push\" (UniqueName: \"kubernetes.io/secret/c41a7100-4d12-4872-8738-2849a6efeba8-builder-dockercfg-rw7jx-push\") pod \"service-telemetry-framework-index-3-build\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " pod="service-telemetry/service-telemetry-framework-index-3-build" Feb 21 00:19:45 
crc kubenswrapper[4906]: I0221 00:19:45.968225 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/c41a7100-4d12-4872-8738-2849a6efeba8-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-3-build\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " pod="service-telemetry/service-telemetry-framework-index-3-build" Feb 21 00:19:45 crc kubenswrapper[4906]: I0221 00:19:45.970140 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j79s\" (UniqueName: \"kubernetes.io/projected/c41a7100-4d12-4872-8738-2849a6efeba8-kube-api-access-7j79s\") pod \"service-telemetry-framework-index-3-build\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " pod="service-telemetry/service-telemetry-framework-index-3-build" Feb 21 00:19:46 crc kubenswrapper[4906]: I0221 00:19:46.104876 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-3-build" Feb 21 00:19:46 crc kubenswrapper[4906]: I0221 00:19:46.358221 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-index-3-build"] Feb 21 00:19:47 crc kubenswrapper[4906]: I0221 00:19:47.364033 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-3-build" event={"ID":"c41a7100-4d12-4872-8738-2849a6efeba8","Type":"ContainerStarted","Data":"d7a79c940e5c82fcddb774703526cc696c02a55907c8049e9563b675f9ce50bc"} Feb 21 00:19:47 crc kubenswrapper[4906]: I0221 00:19:47.364354 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-3-build" event={"ID":"c41a7100-4d12-4872-8738-2849a6efeba8","Type":"ContainerStarted","Data":"70363cc25beb5c0b69c96dacd7e45b53b4afe73bae56c1b4adfe1a51f6385bcc"} Feb 21 00:19:47 crc kubenswrapper[4906]: E0221 00:19:47.437468 4906 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3717330381476959516, SKID=, AKID=F9:DE:2F:E9:32:18:1D:91:9B:B9:E7:95:44:B2:F0:DC:64:1E:3D:CB failed: x509: certificate signed by unknown authority" Feb 21 00:19:48 crc kubenswrapper[4906]: I0221 00:19:48.476315 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-framework-index-3-build"] Feb 21 00:19:49 crc kubenswrapper[4906]: I0221 00:19:49.379354 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-framework-index-3-build" podUID="c41a7100-4d12-4872-8738-2849a6efeba8" containerName="git-clone" containerID="cri-o://d7a79c940e5c82fcddb774703526cc696c02a55907c8049e9563b675f9ce50bc" gracePeriod=30 Feb 21 00:19:49 crc kubenswrapper[4906]: I0221 00:19:49.871460 4906 log.go:25] "Finished parsing log file" 
path="/var/log/pods/service-telemetry_service-telemetry-framework-index-3-build_c41a7100-4d12-4872-8738-2849a6efeba8/git-clone/0.log" Feb 21 00:19:49 crc kubenswrapper[4906]: I0221 00:19:49.871803 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-3-build" Feb 21 00:19:50 crc kubenswrapper[4906]: I0221 00:19:50.014324 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7j79s\" (UniqueName: \"kubernetes.io/projected/c41a7100-4d12-4872-8738-2849a6efeba8-kube-api-access-7j79s\") pod \"c41a7100-4d12-4872-8738-2849a6efeba8\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " Feb 21 00:19:50 crc kubenswrapper[4906]: I0221 00:19:50.014614 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c41a7100-4d12-4872-8738-2849a6efeba8-build-system-configs\") pod \"c41a7100-4d12-4872-8738-2849a6efeba8\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " Feb 21 00:19:50 crc kubenswrapper[4906]: I0221 00:19:50.014795 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c41a7100-4d12-4872-8738-2849a6efeba8-container-storage-root\") pod \"c41a7100-4d12-4872-8738-2849a6efeba8\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " Feb 21 00:19:50 crc kubenswrapper[4906]: I0221 00:19:50.014966 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/c41a7100-4d12-4872-8738-2849a6efeba8-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"c41a7100-4d12-4872-8738-2849a6efeba8\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " Feb 21 00:19:50 crc kubenswrapper[4906]: I0221 00:19:50.015095 4906 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c41a7100-4d12-4872-8738-2849a6efeba8-build-proxy-ca-bundles\") pod \"c41a7100-4d12-4872-8738-2849a6efeba8\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " Feb 21 00:19:50 crc kubenswrapper[4906]: I0221 00:19:50.015226 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c41a7100-4d12-4872-8738-2849a6efeba8-build-ca-bundles\") pod \"c41a7100-4d12-4872-8738-2849a6efeba8\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " Feb 21 00:19:50 crc kubenswrapper[4906]: I0221 00:19:50.015082 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c41a7100-4d12-4872-8738-2849a6efeba8-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "c41a7100-4d12-4872-8738-2849a6efeba8" (UID: "c41a7100-4d12-4872-8738-2849a6efeba8"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:19:50 crc kubenswrapper[4906]: I0221 00:19:50.015332 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c41a7100-4d12-4872-8738-2849a6efeba8-node-pullsecrets\") pod \"c41a7100-4d12-4872-8738-2849a6efeba8\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " Feb 21 00:19:50 crc kubenswrapper[4906]: I0221 00:19:50.015423 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c41a7100-4d12-4872-8738-2849a6efeba8-buildcachedir\") pod \"c41a7100-4d12-4872-8738-2849a6efeba8\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " Feb 21 00:19:50 crc kubenswrapper[4906]: I0221 00:19:50.015481 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-rw7jx-pull\" (UniqueName: \"kubernetes.io/secret/c41a7100-4d12-4872-8738-2849a6efeba8-builder-dockercfg-rw7jx-pull\") pod \"c41a7100-4d12-4872-8738-2849a6efeba8\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " Feb 21 00:19:50 crc kubenswrapper[4906]: I0221 00:19:50.015503 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-rw7jx-push\" (UniqueName: \"kubernetes.io/secret/c41a7100-4d12-4872-8738-2849a6efeba8-builder-dockercfg-rw7jx-push\") pod \"c41a7100-4d12-4872-8738-2849a6efeba8\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " Feb 21 00:19:50 crc kubenswrapper[4906]: I0221 00:19:50.015529 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c41a7100-4d12-4872-8738-2849a6efeba8-build-blob-cache\") pod \"c41a7100-4d12-4872-8738-2849a6efeba8\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " Feb 21 00:19:50 crc kubenswrapper[4906]: I0221 00:19:50.015552 4906 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c41a7100-4d12-4872-8738-2849a6efeba8-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "c41a7100-4d12-4872-8738-2849a6efeba8" (UID: "c41a7100-4d12-4872-8738-2849a6efeba8"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 00:19:50 crc kubenswrapper[4906]: I0221 00:19:50.015564 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c41a7100-4d12-4872-8738-2849a6efeba8-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "c41a7100-4d12-4872-8738-2849a6efeba8" (UID: "c41a7100-4d12-4872-8738-2849a6efeba8"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:19:50 crc kubenswrapper[4906]: I0221 00:19:50.015581 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c41a7100-4d12-4872-8738-2849a6efeba8-container-storage-run\") pod \"c41a7100-4d12-4872-8738-2849a6efeba8\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " Feb 21 00:19:50 crc kubenswrapper[4906]: I0221 00:19:50.015612 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c41a7100-4d12-4872-8738-2849a6efeba8-buildworkdir\") pod \"c41a7100-4d12-4872-8738-2849a6efeba8\" (UID: \"c41a7100-4d12-4872-8738-2849a6efeba8\") " Feb 21 00:19:50 crc kubenswrapper[4906]: I0221 00:19:50.015665 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c41a7100-4d12-4872-8738-2849a6efeba8-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "c41a7100-4d12-4872-8738-2849a6efeba8" (UID: "c41a7100-4d12-4872-8738-2849a6efeba8"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:19:50 crc kubenswrapper[4906]: I0221 00:19:50.015728 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c41a7100-4d12-4872-8738-2849a6efeba8-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "c41a7100-4d12-4872-8738-2849a6efeba8" (UID: "c41a7100-4d12-4872-8738-2849a6efeba8"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:19:50 crc kubenswrapper[4906]: I0221 00:19:50.015845 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c41a7100-4d12-4872-8738-2849a6efeba8-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "c41a7100-4d12-4872-8738-2849a6efeba8" (UID: "c41a7100-4d12-4872-8738-2849a6efeba8"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:19:50 crc kubenswrapper[4906]: I0221 00:19:50.016013 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c41a7100-4d12-4872-8738-2849a6efeba8-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "c41a7100-4d12-4872-8738-2849a6efeba8" (UID: "c41a7100-4d12-4872-8738-2849a6efeba8"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:19:50 crc kubenswrapper[4906]: I0221 00:19:50.016018 4906 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c41a7100-4d12-4872-8738-2849a6efeba8-build-system-configs\") on node \"crc\" DevicePath \"\"" Feb 21 00:19:50 crc kubenswrapper[4906]: I0221 00:19:50.016036 4906 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c41a7100-4d12-4872-8738-2849a6efeba8-container-storage-root\") on node \"crc\" DevicePath \"\"" Feb 21 00:19:50 crc kubenswrapper[4906]: I0221 00:19:50.016045 4906 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c41a7100-4d12-4872-8738-2849a6efeba8-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 21 00:19:50 crc kubenswrapper[4906]: I0221 00:19:50.016053 4906 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c41a7100-4d12-4872-8738-2849a6efeba8-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 21 00:19:50 crc kubenswrapper[4906]: I0221 00:19:50.016052 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c41a7100-4d12-4872-8738-2849a6efeba8-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "c41a7100-4d12-4872-8738-2849a6efeba8" (UID: "c41a7100-4d12-4872-8738-2849a6efeba8"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:19:50 crc kubenswrapper[4906]: I0221 00:19:50.016063 4906 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c41a7100-4d12-4872-8738-2849a6efeba8-buildcachedir\") on node \"crc\" DevicePath \"\"" Feb 21 00:19:50 crc kubenswrapper[4906]: I0221 00:19:50.016302 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c41a7100-4d12-4872-8738-2849a6efeba8-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "c41a7100-4d12-4872-8738-2849a6efeba8" (UID: "c41a7100-4d12-4872-8738-2849a6efeba8"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 00:19:50 crc kubenswrapper[4906]: I0221 00:19:50.016546 4906 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c41a7100-4d12-4872-8738-2849a6efeba8-build-blob-cache\") on node \"crc\" DevicePath \"\"" Feb 21 00:19:50 crc kubenswrapper[4906]: I0221 00:19:50.020455 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c41a7100-4d12-4872-8738-2849a6efeba8-builder-dockercfg-rw7jx-pull" (OuterVolumeSpecName: "builder-dockercfg-rw7jx-pull") pod "c41a7100-4d12-4872-8738-2849a6efeba8" (UID: "c41a7100-4d12-4872-8738-2849a6efeba8"). InnerVolumeSpecName "builder-dockercfg-rw7jx-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 00:19:50 crc kubenswrapper[4906]: I0221 00:19:50.020534 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c41a7100-4d12-4872-8738-2849a6efeba8-service-telemetry-framework-index-dockercfg-user-build-volume" (OuterVolumeSpecName: "service-telemetry-framework-index-dockercfg-user-build-volume") pod "c41a7100-4d12-4872-8738-2849a6efeba8" (UID: "c41a7100-4d12-4872-8738-2849a6efeba8"). 
InnerVolumeSpecName "service-telemetry-framework-index-dockercfg-user-build-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 00:19:50 crc kubenswrapper[4906]: I0221 00:19:50.020555 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c41a7100-4d12-4872-8738-2849a6efeba8-builder-dockercfg-rw7jx-push" (OuterVolumeSpecName: "builder-dockercfg-rw7jx-push") pod "c41a7100-4d12-4872-8738-2849a6efeba8" (UID: "c41a7100-4d12-4872-8738-2849a6efeba8"). InnerVolumeSpecName "builder-dockercfg-rw7jx-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 00:19:50 crc kubenswrapper[4906]: I0221 00:19:50.021952 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c41a7100-4d12-4872-8738-2849a6efeba8-kube-api-access-7j79s" (OuterVolumeSpecName: "kube-api-access-7j79s") pod "c41a7100-4d12-4872-8738-2849a6efeba8" (UID: "c41a7100-4d12-4872-8738-2849a6efeba8"). InnerVolumeSpecName "kube-api-access-7j79s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:19:50 crc kubenswrapper[4906]: I0221 00:19:50.118858 4906 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c41a7100-4d12-4872-8738-2849a6efeba8-container-storage-run\") on node \"crc\" DevicePath \"\"" Feb 21 00:19:50 crc kubenswrapper[4906]: I0221 00:19:50.118898 4906 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c41a7100-4d12-4872-8738-2849a6efeba8-buildworkdir\") on node \"crc\" DevicePath \"\"" Feb 21 00:19:50 crc kubenswrapper[4906]: I0221 00:19:50.118910 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7j79s\" (UniqueName: \"kubernetes.io/projected/c41a7100-4d12-4872-8738-2849a6efeba8-kube-api-access-7j79s\") on node \"crc\" DevicePath \"\"" Feb 21 00:19:50 crc kubenswrapper[4906]: I0221 00:19:50.118923 4906 reconciler_common.go:293] "Volume detached for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/c41a7100-4d12-4872-8738-2849a6efeba8-service-telemetry-framework-index-dockercfg-user-build-volume\") on node \"crc\" DevicePath \"\"" Feb 21 00:19:50 crc kubenswrapper[4906]: I0221 00:19:50.118940 4906 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c41a7100-4d12-4872-8738-2849a6efeba8-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Feb 21 00:19:50 crc kubenswrapper[4906]: I0221 00:19:50.118956 4906 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-rw7jx-push\" (UniqueName: \"kubernetes.io/secret/c41a7100-4d12-4872-8738-2849a6efeba8-builder-dockercfg-rw7jx-push\") on node \"crc\" DevicePath \"\"" Feb 21 00:19:50 crc kubenswrapper[4906]: I0221 00:19:50.118975 4906 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-rw7jx-pull\" (UniqueName: 
\"kubernetes.io/secret/c41a7100-4d12-4872-8738-2849a6efeba8-builder-dockercfg-rw7jx-pull\") on node \"crc\" DevicePath \"\"" Feb 21 00:19:50 crc kubenswrapper[4906]: I0221 00:19:50.388158 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-framework-index-3-build_c41a7100-4d12-4872-8738-2849a6efeba8/git-clone/0.log" Feb 21 00:19:50 crc kubenswrapper[4906]: I0221 00:19:50.388548 4906 generic.go:334] "Generic (PLEG): container finished" podID="c41a7100-4d12-4872-8738-2849a6efeba8" containerID="d7a79c940e5c82fcddb774703526cc696c02a55907c8049e9563b675f9ce50bc" exitCode=1 Feb 21 00:19:50 crc kubenswrapper[4906]: I0221 00:19:50.388596 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-3-build" event={"ID":"c41a7100-4d12-4872-8738-2849a6efeba8","Type":"ContainerDied","Data":"d7a79c940e5c82fcddb774703526cc696c02a55907c8049e9563b675f9ce50bc"} Feb 21 00:19:50 crc kubenswrapper[4906]: I0221 00:19:50.388637 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-3-build" event={"ID":"c41a7100-4d12-4872-8738-2849a6efeba8","Type":"ContainerDied","Data":"70363cc25beb5c0b69c96dacd7e45b53b4afe73bae56c1b4adfe1a51f6385bcc"} Feb 21 00:19:50 crc kubenswrapper[4906]: I0221 00:19:50.388666 4906 scope.go:117] "RemoveContainer" containerID="d7a79c940e5c82fcddb774703526cc696c02a55907c8049e9563b675f9ce50bc" Feb 21 00:19:50 crc kubenswrapper[4906]: I0221 00:19:50.388937 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-3-build" Feb 21 00:19:50 crc kubenswrapper[4906]: I0221 00:19:50.413912 4906 scope.go:117] "RemoveContainer" containerID="d7a79c940e5c82fcddb774703526cc696c02a55907c8049e9563b675f9ce50bc" Feb 21 00:19:50 crc kubenswrapper[4906]: E0221 00:19:50.414835 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7a79c940e5c82fcddb774703526cc696c02a55907c8049e9563b675f9ce50bc\": container with ID starting with d7a79c940e5c82fcddb774703526cc696c02a55907c8049e9563b675f9ce50bc not found: ID does not exist" containerID="d7a79c940e5c82fcddb774703526cc696c02a55907c8049e9563b675f9ce50bc" Feb 21 00:19:50 crc kubenswrapper[4906]: I0221 00:19:50.414877 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7a79c940e5c82fcddb774703526cc696c02a55907c8049e9563b675f9ce50bc"} err="failed to get container status \"d7a79c940e5c82fcddb774703526cc696c02a55907c8049e9563b675f9ce50bc\": rpc error: code = NotFound desc = could not find container \"d7a79c940e5c82fcddb774703526cc696c02a55907c8049e9563b675f9ce50bc\": container with ID starting with d7a79c940e5c82fcddb774703526cc696c02a55907c8049e9563b675f9ce50bc not found: ID does not exist" Feb 21 00:19:50 crc kubenswrapper[4906]: I0221 00:19:50.448880 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-framework-index-3-build"] Feb 21 00:19:50 crc kubenswrapper[4906]: I0221 00:19:50.457694 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-framework-index-3-build"] Feb 21 00:19:51 crc kubenswrapper[4906]: I0221 00:19:51.531015 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c41a7100-4d12-4872-8738-2849a6efeba8" path="/var/lib/kubelet/pods/c41a7100-4d12-4872-8738-2849a6efeba8/volumes" Feb 21 00:19:59 crc kubenswrapper[4906]: I0221 
00:19:59.944828 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-framework-index-4-build"] Feb 21 00:19:59 crc kubenswrapper[4906]: E0221 00:19:59.945870 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c41a7100-4d12-4872-8738-2849a6efeba8" containerName="git-clone" Feb 21 00:19:59 crc kubenswrapper[4906]: I0221 00:19:59.945892 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="c41a7100-4d12-4872-8738-2849a6efeba8" containerName="git-clone" Feb 21 00:19:59 crc kubenswrapper[4906]: I0221 00:19:59.946054 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="c41a7100-4d12-4872-8738-2849a6efeba8" containerName="git-clone" Feb 21 00:19:59 crc kubenswrapper[4906]: I0221 00:19:59.947355 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-4-build" Feb 21 00:19:59 crc kubenswrapper[4906]: I0221 00:19:59.951040 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/387c8b2c-8ffa-42e1-8818-37f9795384d1-container-storage-root\") pod \"service-telemetry-framework-index-4-build\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " pod="service-telemetry/service-telemetry-framework-index-4-build" Feb 21 00:19:59 crc kubenswrapper[4906]: I0221 00:19:59.951109 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj6pp\" (UniqueName: \"kubernetes.io/projected/387c8b2c-8ffa-42e1-8818-37f9795384d1-kube-api-access-kj6pp\") pod \"service-telemetry-framework-index-4-build\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " pod="service-telemetry/service-telemetry-framework-index-4-build" Feb 21 00:19:59 crc kubenswrapper[4906]: I0221 00:19:59.951141 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/387c8b2c-8ffa-42e1-8818-37f9795384d1-buildcachedir\") pod \"service-telemetry-framework-index-4-build\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " pod="service-telemetry/service-telemetry-framework-index-4-build" Feb 21 00:19:59 crc kubenswrapper[4906]: I0221 00:19:59.951171 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-rw7jx-push\" (UniqueName: \"kubernetes.io/secret/387c8b2c-8ffa-42e1-8818-37f9795384d1-builder-dockercfg-rw7jx-push\") pod \"service-telemetry-framework-index-4-build\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " pod="service-telemetry/service-telemetry-framework-index-4-build" Feb 21 00:19:59 crc kubenswrapper[4906]: I0221 00:19:59.951222 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/387c8b2c-8ffa-42e1-8818-37f9795384d1-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-4-build\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " pod="service-telemetry/service-telemetry-framework-index-4-build" Feb 21 00:19:59 crc kubenswrapper[4906]: I0221 00:19:59.951255 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-rw7jx-pull\" (UniqueName: \"kubernetes.io/secret/387c8b2c-8ffa-42e1-8818-37f9795384d1-builder-dockercfg-rw7jx-pull\") pod \"service-telemetry-framework-index-4-build\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " pod="service-telemetry/service-telemetry-framework-index-4-build" Feb 21 00:19:59 crc kubenswrapper[4906]: I0221 00:19:59.951298 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/387c8b2c-8ffa-42e1-8818-37f9795384d1-build-ca-bundles\") pod \"service-telemetry-framework-index-4-build\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " pod="service-telemetry/service-telemetry-framework-index-4-build" Feb 21 00:19:59 crc kubenswrapper[4906]: I0221 00:19:59.951324 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/387c8b2c-8ffa-42e1-8818-37f9795384d1-container-storage-run\") pod \"service-telemetry-framework-index-4-build\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " pod="service-telemetry/service-telemetry-framework-index-4-build" Feb 21 00:19:59 crc kubenswrapper[4906]: I0221 00:19:59.951351 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/387c8b2c-8ffa-42e1-8818-37f9795384d1-build-system-configs\") pod \"service-telemetry-framework-index-4-build\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " pod="service-telemetry/service-telemetry-framework-index-4-build" Feb 21 00:19:59 crc kubenswrapper[4906]: I0221 00:19:59.951402 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/387c8b2c-8ffa-42e1-8818-37f9795384d1-node-pullsecrets\") pod \"service-telemetry-framework-index-4-build\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " pod="service-telemetry/service-telemetry-framework-index-4-build" Feb 21 00:19:59 crc kubenswrapper[4906]: I0221 00:19:59.951763 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"service-telemetry-framework-index-dockercfg" Feb 21 00:19:59 crc kubenswrapper[4906]: I0221 00:19:59.951797 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" 
(UniqueName: \"kubernetes.io/configmap/387c8b2c-8ffa-42e1-8818-37f9795384d1-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-4-build\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " pod="service-telemetry/service-telemetry-framework-index-4-build" Feb 21 00:19:59 crc kubenswrapper[4906]: I0221 00:19:59.951856 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/387c8b2c-8ffa-42e1-8818-37f9795384d1-buildworkdir\") pod \"service-telemetry-framework-index-4-build\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " pod="service-telemetry/service-telemetry-framework-index-4-build" Feb 21 00:19:59 crc kubenswrapper[4906]: I0221 00:19:59.951892 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/387c8b2c-8ffa-42e1-8818-37f9795384d1-build-blob-cache\") pod \"service-telemetry-framework-index-4-build\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " pod="service-telemetry/service-telemetry-framework-index-4-build" Feb 21 00:19:59 crc kubenswrapper[4906]: I0221 00:19:59.952415 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-4-sys-config" Feb 21 00:19:59 crc kubenswrapper[4906]: I0221 00:19:59.952714 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-4-ca" Feb 21 00:19:59 crc kubenswrapper[4906]: I0221 00:19:59.953453 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-rw7jx" Feb 21 00:19:59 crc kubenswrapper[4906]: I0221 00:19:59.955733 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-4-global-ca" Feb 21 00:19:59 crc kubenswrapper[4906]: I0221 00:19:59.986775 4906 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-index-4-build"] Feb 21 00:20:00 crc kubenswrapper[4906]: I0221 00:20:00.053556 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/387c8b2c-8ffa-42e1-8818-37f9795384d1-build-ca-bundles\") pod \"service-telemetry-framework-index-4-build\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " pod="service-telemetry/service-telemetry-framework-index-4-build" Feb 21 00:20:00 crc kubenswrapper[4906]: I0221 00:20:00.053610 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/387c8b2c-8ffa-42e1-8818-37f9795384d1-container-storage-run\") pod \"service-telemetry-framework-index-4-build\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " pod="service-telemetry/service-telemetry-framework-index-4-build" Feb 21 00:20:00 crc kubenswrapper[4906]: I0221 00:20:00.053641 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/387c8b2c-8ffa-42e1-8818-37f9795384d1-build-system-configs\") pod \"service-telemetry-framework-index-4-build\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " pod="service-telemetry/service-telemetry-framework-index-4-build" Feb 21 00:20:00 crc kubenswrapper[4906]: I0221 00:20:00.053711 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/387c8b2c-8ffa-42e1-8818-37f9795384d1-node-pullsecrets\") pod \"service-telemetry-framework-index-4-build\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " pod="service-telemetry/service-telemetry-framework-index-4-build" Feb 21 00:20:00 crc kubenswrapper[4906]: I0221 00:20:00.053760 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/387c8b2c-8ffa-42e1-8818-37f9795384d1-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-4-build\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " pod="service-telemetry/service-telemetry-framework-index-4-build" Feb 21 00:20:00 crc kubenswrapper[4906]: I0221 00:20:00.053783 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/387c8b2c-8ffa-42e1-8818-37f9795384d1-buildworkdir\") pod \"service-telemetry-framework-index-4-build\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " pod="service-telemetry/service-telemetry-framework-index-4-build" Feb 21 00:20:00 crc kubenswrapper[4906]: I0221 00:20:00.053803 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/387c8b2c-8ffa-42e1-8818-37f9795384d1-build-blob-cache\") pod \"service-telemetry-framework-index-4-build\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " pod="service-telemetry/service-telemetry-framework-index-4-build" Feb 21 00:20:00 crc kubenswrapper[4906]: I0221 00:20:00.053844 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/387c8b2c-8ffa-42e1-8818-37f9795384d1-container-storage-root\") pod \"service-telemetry-framework-index-4-build\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " pod="service-telemetry/service-telemetry-framework-index-4-build" Feb 21 00:20:00 crc kubenswrapper[4906]: I0221 00:20:00.053864 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kj6pp\" (UniqueName: \"kubernetes.io/projected/387c8b2c-8ffa-42e1-8818-37f9795384d1-kube-api-access-kj6pp\") pod \"service-telemetry-framework-index-4-build\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " 
pod="service-telemetry/service-telemetry-framework-index-4-build" Feb 21 00:20:00 crc kubenswrapper[4906]: I0221 00:20:00.053881 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/387c8b2c-8ffa-42e1-8818-37f9795384d1-buildcachedir\") pod \"service-telemetry-framework-index-4-build\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " pod="service-telemetry/service-telemetry-framework-index-4-build" Feb 21 00:20:00 crc kubenswrapper[4906]: I0221 00:20:00.053897 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-rw7jx-push\" (UniqueName: \"kubernetes.io/secret/387c8b2c-8ffa-42e1-8818-37f9795384d1-builder-dockercfg-rw7jx-push\") pod \"service-telemetry-framework-index-4-build\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " pod="service-telemetry/service-telemetry-framework-index-4-build" Feb 21 00:20:00 crc kubenswrapper[4906]: I0221 00:20:00.053929 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/387c8b2c-8ffa-42e1-8818-37f9795384d1-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-4-build\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " pod="service-telemetry/service-telemetry-framework-index-4-build" Feb 21 00:20:00 crc kubenswrapper[4906]: I0221 00:20:00.053948 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-rw7jx-pull\" (UniqueName: \"kubernetes.io/secret/387c8b2c-8ffa-42e1-8818-37f9795384d1-builder-dockercfg-rw7jx-pull\") pod \"service-telemetry-framework-index-4-build\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " pod="service-telemetry/service-telemetry-framework-index-4-build" Feb 21 00:20:00 crc kubenswrapper[4906]: I0221 00:20:00.053996 4906 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/387c8b2c-8ffa-42e1-8818-37f9795384d1-node-pullsecrets\") pod \"service-telemetry-framework-index-4-build\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " pod="service-telemetry/service-telemetry-framework-index-4-build" Feb 21 00:20:00 crc kubenswrapper[4906]: I0221 00:20:00.054085 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/387c8b2c-8ffa-42e1-8818-37f9795384d1-buildcachedir\") pod \"service-telemetry-framework-index-4-build\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " pod="service-telemetry/service-telemetry-framework-index-4-build" Feb 21 00:20:00 crc kubenswrapper[4906]: I0221 00:20:00.054364 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/387c8b2c-8ffa-42e1-8818-37f9795384d1-buildworkdir\") pod \"service-telemetry-framework-index-4-build\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " pod="service-telemetry/service-telemetry-framework-index-4-build" Feb 21 00:20:00 crc kubenswrapper[4906]: I0221 00:20:00.054475 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/387c8b2c-8ffa-42e1-8818-37f9795384d1-build-blob-cache\") pod \"service-telemetry-framework-index-4-build\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " pod="service-telemetry/service-telemetry-framework-index-4-build" Feb 21 00:20:00 crc kubenswrapper[4906]: I0221 00:20:00.054600 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/387c8b2c-8ffa-42e1-8818-37f9795384d1-container-storage-root\") pod \"service-telemetry-framework-index-4-build\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " pod="service-telemetry/service-telemetry-framework-index-4-build" 
Feb 21 00:20:00 crc kubenswrapper[4906]: I0221 00:20:00.054633 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/387c8b2c-8ffa-42e1-8818-37f9795384d1-container-storage-run\") pod \"service-telemetry-framework-index-4-build\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " pod="service-telemetry/service-telemetry-framework-index-4-build" Feb 21 00:20:00 crc kubenswrapper[4906]: I0221 00:20:00.055652 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/387c8b2c-8ffa-42e1-8818-37f9795384d1-build-system-configs\") pod \"service-telemetry-framework-index-4-build\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " pod="service-telemetry/service-telemetry-framework-index-4-build" Feb 21 00:20:00 crc kubenswrapper[4906]: I0221 00:20:00.055822 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/387c8b2c-8ffa-42e1-8818-37f9795384d1-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-4-build\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " pod="service-telemetry/service-telemetry-framework-index-4-build" Feb 21 00:20:00 crc kubenswrapper[4906]: I0221 00:20:00.055936 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/387c8b2c-8ffa-42e1-8818-37f9795384d1-build-ca-bundles\") pod \"service-telemetry-framework-index-4-build\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " pod="service-telemetry/service-telemetry-framework-index-4-build" Feb 21 00:20:00 crc kubenswrapper[4906]: I0221 00:20:00.062151 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-rw7jx-push\" (UniqueName: \"kubernetes.io/secret/387c8b2c-8ffa-42e1-8818-37f9795384d1-builder-dockercfg-rw7jx-push\") pod 
\"service-telemetry-framework-index-4-build\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " pod="service-telemetry/service-telemetry-framework-index-4-build" Feb 21 00:20:00 crc kubenswrapper[4906]: I0221 00:20:00.062355 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-rw7jx-pull\" (UniqueName: \"kubernetes.io/secret/387c8b2c-8ffa-42e1-8818-37f9795384d1-builder-dockercfg-rw7jx-pull\") pod \"service-telemetry-framework-index-4-build\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " pod="service-telemetry/service-telemetry-framework-index-4-build" Feb 21 00:20:00 crc kubenswrapper[4906]: I0221 00:20:00.062376 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/387c8b2c-8ffa-42e1-8818-37f9795384d1-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-4-build\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " pod="service-telemetry/service-telemetry-framework-index-4-build" Feb 21 00:20:00 crc kubenswrapper[4906]: I0221 00:20:00.070158 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj6pp\" (UniqueName: \"kubernetes.io/projected/387c8b2c-8ffa-42e1-8818-37f9795384d1-kube-api-access-kj6pp\") pod \"service-telemetry-framework-index-4-build\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " pod="service-telemetry/service-telemetry-framework-index-4-build" Feb 21 00:20:00 crc kubenswrapper[4906]: I0221 00:20:00.276458 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-4-build" Feb 21 00:20:00 crc kubenswrapper[4906]: I0221 00:20:00.522394 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-index-4-build"] Feb 21 00:20:01 crc kubenswrapper[4906]: I0221 00:20:01.473194 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-4-build" event={"ID":"387c8b2c-8ffa-42e1-8818-37f9795384d1","Type":"ContainerStarted","Data":"3082d7272a6fc206d6f70a83d64d97aa491686680b8d12c31d095cb0cbfdc992"} Feb 21 00:20:01 crc kubenswrapper[4906]: I0221 00:20:01.473247 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-4-build" event={"ID":"387c8b2c-8ffa-42e1-8818-37f9795384d1","Type":"ContainerStarted","Data":"b1cb35a5f14ea5014a11c196459005c6a58260ebfe9ae441d0553f2fed917947"} Feb 21 00:20:01 crc kubenswrapper[4906]: E0221 00:20:01.550628 4906 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3717330381476959516, SKID=, AKID=F9:DE:2F:E9:32:18:1D:91:9B:B9:E7:95:44:B2:F0:DC:64:1E:3D:CB failed: x509: certificate signed by unknown authority" Feb 21 00:20:02 crc kubenswrapper[4906]: I0221 00:20:02.586606 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-framework-index-4-build"] Feb 21 00:20:03 crc kubenswrapper[4906]: I0221 00:20:03.487511 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-framework-index-4-build" podUID="387c8b2c-8ffa-42e1-8818-37f9795384d1" containerName="git-clone" containerID="cri-o://3082d7272a6fc206d6f70a83d64d97aa491686680b8d12c31d095cb0cbfdc992" gracePeriod=30 Feb 21 00:20:03 crc kubenswrapper[4906]: I0221 00:20:03.892267 4906 log.go:25] "Finished parsing log file" 
path="/var/log/pods/service-telemetry_service-telemetry-framework-index-4-build_387c8b2c-8ffa-42e1-8818-37f9795384d1/git-clone/0.log" Feb 21 00:20:03 crc kubenswrapper[4906]: I0221 00:20:03.892344 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-4-build" Feb 21 00:20:04 crc kubenswrapper[4906]: I0221 00:20:04.003211 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kj6pp\" (UniqueName: \"kubernetes.io/projected/387c8b2c-8ffa-42e1-8818-37f9795384d1-kube-api-access-kj6pp\") pod \"387c8b2c-8ffa-42e1-8818-37f9795384d1\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " Feb 21 00:20:04 crc kubenswrapper[4906]: I0221 00:20:04.003305 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/387c8b2c-8ffa-42e1-8818-37f9795384d1-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"387c8b2c-8ffa-42e1-8818-37f9795384d1\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " Feb 21 00:20:04 crc kubenswrapper[4906]: I0221 00:20:04.003347 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/387c8b2c-8ffa-42e1-8818-37f9795384d1-container-storage-root\") pod \"387c8b2c-8ffa-42e1-8818-37f9795384d1\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " Feb 21 00:20:04 crc kubenswrapper[4906]: I0221 00:20:04.003392 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/387c8b2c-8ffa-42e1-8818-37f9795384d1-build-blob-cache\") pod \"387c8b2c-8ffa-42e1-8818-37f9795384d1\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " Feb 21 00:20:04 crc kubenswrapper[4906]: I0221 00:20:04.003473 4906 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/387c8b2c-8ffa-42e1-8818-37f9795384d1-build-ca-bundles\") pod \"387c8b2c-8ffa-42e1-8818-37f9795384d1\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " Feb 21 00:20:04 crc kubenswrapper[4906]: I0221 00:20:04.003511 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/387c8b2c-8ffa-42e1-8818-37f9795384d1-container-storage-run\") pod \"387c8b2c-8ffa-42e1-8818-37f9795384d1\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " Feb 21 00:20:04 crc kubenswrapper[4906]: I0221 00:20:04.003546 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/387c8b2c-8ffa-42e1-8818-37f9795384d1-build-proxy-ca-bundles\") pod \"387c8b2c-8ffa-42e1-8818-37f9795384d1\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " Feb 21 00:20:04 crc kubenswrapper[4906]: I0221 00:20:04.003602 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/387c8b2c-8ffa-42e1-8818-37f9795384d1-buildworkdir\") pod \"387c8b2c-8ffa-42e1-8818-37f9795384d1\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " Feb 21 00:20:04 crc kubenswrapper[4906]: I0221 00:20:04.003638 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-rw7jx-pull\" (UniqueName: \"kubernetes.io/secret/387c8b2c-8ffa-42e1-8818-37f9795384d1-builder-dockercfg-rw7jx-pull\") pod \"387c8b2c-8ffa-42e1-8818-37f9795384d1\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " Feb 21 00:20:04 crc kubenswrapper[4906]: I0221 00:20:04.003669 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-rw7jx-push\" (UniqueName: 
\"kubernetes.io/secret/387c8b2c-8ffa-42e1-8818-37f9795384d1-builder-dockercfg-rw7jx-push\") pod \"387c8b2c-8ffa-42e1-8818-37f9795384d1\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " Feb 21 00:20:04 crc kubenswrapper[4906]: I0221 00:20:04.003737 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/387c8b2c-8ffa-42e1-8818-37f9795384d1-node-pullsecrets\") pod \"387c8b2c-8ffa-42e1-8818-37f9795384d1\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " Feb 21 00:20:04 crc kubenswrapper[4906]: I0221 00:20:04.003784 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/387c8b2c-8ffa-42e1-8818-37f9795384d1-buildcachedir\") pod \"387c8b2c-8ffa-42e1-8818-37f9795384d1\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " Feb 21 00:20:04 crc kubenswrapper[4906]: I0221 00:20:04.003833 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/387c8b2c-8ffa-42e1-8818-37f9795384d1-build-system-configs\") pod \"387c8b2c-8ffa-42e1-8818-37f9795384d1\" (UID: \"387c8b2c-8ffa-42e1-8818-37f9795384d1\") " Feb 21 00:20:04 crc kubenswrapper[4906]: I0221 00:20:04.004126 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/387c8b2c-8ffa-42e1-8818-37f9795384d1-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "387c8b2c-8ffa-42e1-8818-37f9795384d1" (UID: "387c8b2c-8ffa-42e1-8818-37f9795384d1"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 00:20:04 crc kubenswrapper[4906]: I0221 00:20:04.004160 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/387c8b2c-8ffa-42e1-8818-37f9795384d1-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "387c8b2c-8ffa-42e1-8818-37f9795384d1" (UID: "387c8b2c-8ffa-42e1-8818-37f9795384d1"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 00:20:04 crc kubenswrapper[4906]: I0221 00:20:04.004317 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/387c8b2c-8ffa-42e1-8818-37f9795384d1-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "387c8b2c-8ffa-42e1-8818-37f9795384d1" (UID: "387c8b2c-8ffa-42e1-8818-37f9795384d1"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:20:04 crc kubenswrapper[4906]: I0221 00:20:04.004343 4906 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/387c8b2c-8ffa-42e1-8818-37f9795384d1-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Feb 21 00:20:04 crc kubenswrapper[4906]: I0221 00:20:04.004321 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/387c8b2c-8ffa-42e1-8818-37f9795384d1-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "387c8b2c-8ffa-42e1-8818-37f9795384d1" (UID: "387c8b2c-8ffa-42e1-8818-37f9795384d1"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:20:04 crc kubenswrapper[4906]: I0221 00:20:04.004367 4906 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/387c8b2c-8ffa-42e1-8818-37f9795384d1-buildcachedir\") on node \"crc\" DevicePath \"\"" Feb 21 00:20:04 crc kubenswrapper[4906]: I0221 00:20:04.004513 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/387c8b2c-8ffa-42e1-8818-37f9795384d1-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "387c8b2c-8ffa-42e1-8818-37f9795384d1" (UID: "387c8b2c-8ffa-42e1-8818-37f9795384d1"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:20:04 crc kubenswrapper[4906]: I0221 00:20:04.004566 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/387c8b2c-8ffa-42e1-8818-37f9795384d1-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "387c8b2c-8ffa-42e1-8818-37f9795384d1" (UID: "387c8b2c-8ffa-42e1-8818-37f9795384d1"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:20:04 crc kubenswrapper[4906]: I0221 00:20:04.004607 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/387c8b2c-8ffa-42e1-8818-37f9795384d1-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "387c8b2c-8ffa-42e1-8818-37f9795384d1" (UID: "387c8b2c-8ffa-42e1-8818-37f9795384d1"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:20:04 crc kubenswrapper[4906]: I0221 00:20:04.005014 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/387c8b2c-8ffa-42e1-8818-37f9795384d1-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "387c8b2c-8ffa-42e1-8818-37f9795384d1" (UID: "387c8b2c-8ffa-42e1-8818-37f9795384d1"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:20:04 crc kubenswrapper[4906]: I0221 00:20:04.005157 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/387c8b2c-8ffa-42e1-8818-37f9795384d1-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "387c8b2c-8ffa-42e1-8818-37f9795384d1" (UID: "387c8b2c-8ffa-42e1-8818-37f9795384d1"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:20:04 crc kubenswrapper[4906]: I0221 00:20:04.015964 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/387c8b2c-8ffa-42e1-8818-37f9795384d1-builder-dockercfg-rw7jx-pull" (OuterVolumeSpecName: "builder-dockercfg-rw7jx-pull") pod "387c8b2c-8ffa-42e1-8818-37f9795384d1" (UID: "387c8b2c-8ffa-42e1-8818-37f9795384d1"). InnerVolumeSpecName "builder-dockercfg-rw7jx-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 00:20:04 crc kubenswrapper[4906]: I0221 00:20:04.015978 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/387c8b2c-8ffa-42e1-8818-37f9795384d1-kube-api-access-kj6pp" (OuterVolumeSpecName: "kube-api-access-kj6pp") pod "387c8b2c-8ffa-42e1-8818-37f9795384d1" (UID: "387c8b2c-8ffa-42e1-8818-37f9795384d1"). InnerVolumeSpecName "kube-api-access-kj6pp". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 00:20:04 crc kubenswrapper[4906]: I0221 00:20:04.015956 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/387c8b2c-8ffa-42e1-8818-37f9795384d1-service-telemetry-framework-index-dockercfg-user-build-volume" (OuterVolumeSpecName: "service-telemetry-framework-index-dockercfg-user-build-volume") pod "387c8b2c-8ffa-42e1-8818-37f9795384d1" (UID: "387c8b2c-8ffa-42e1-8818-37f9795384d1"). InnerVolumeSpecName "service-telemetry-framework-index-dockercfg-user-build-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 00:20:04 crc kubenswrapper[4906]: I0221 00:20:04.016066 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/387c8b2c-8ffa-42e1-8818-37f9795384d1-builder-dockercfg-rw7jx-push" (OuterVolumeSpecName: "builder-dockercfg-rw7jx-push") pod "387c8b2c-8ffa-42e1-8818-37f9795384d1" (UID: "387c8b2c-8ffa-42e1-8818-37f9795384d1"). InnerVolumeSpecName "builder-dockercfg-rw7jx-push". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 00:20:04 crc kubenswrapper[4906]: I0221 00:20:04.105459 4906 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/387c8b2c-8ffa-42e1-8818-37f9795384d1-build-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 21 00:20:04 crc kubenswrapper[4906]: I0221 00:20:04.105861 4906 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/387c8b2c-8ffa-42e1-8818-37f9795384d1-container-storage-run\") on node \"crc\" DevicePath \"\""
Feb 21 00:20:04 crc kubenswrapper[4906]: I0221 00:20:04.105878 4906 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/387c8b2c-8ffa-42e1-8818-37f9795384d1-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 21 00:20:04 crc kubenswrapper[4906]: I0221 00:20:04.105890 4906 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/387c8b2c-8ffa-42e1-8818-37f9795384d1-buildworkdir\") on node \"crc\" DevicePath \"\""
Feb 21 00:20:04 crc kubenswrapper[4906]: I0221 00:20:04.105902 4906 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-rw7jx-pull\" (UniqueName: \"kubernetes.io/secret/387c8b2c-8ffa-42e1-8818-37f9795384d1-builder-dockercfg-rw7jx-pull\") on node \"crc\" DevicePath \"\""
Feb 21 00:20:04 crc kubenswrapper[4906]: I0221 00:20:04.105914 4906 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-rw7jx-push\" (UniqueName: \"kubernetes.io/secret/387c8b2c-8ffa-42e1-8818-37f9795384d1-builder-dockercfg-rw7jx-push\") on node \"crc\" DevicePath \"\""
Feb 21 00:20:04 crc kubenswrapper[4906]: I0221 00:20:04.105926 4906 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/387c8b2c-8ffa-42e1-8818-37f9795384d1-build-system-configs\") on node \"crc\" DevicePath \"\""
Feb 21 00:20:04 crc kubenswrapper[4906]: I0221 00:20:04.105937 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kj6pp\" (UniqueName: \"kubernetes.io/projected/387c8b2c-8ffa-42e1-8818-37f9795384d1-kube-api-access-kj6pp\") on node \"crc\" DevicePath \"\""
Feb 21 00:20:04 crc kubenswrapper[4906]: I0221 00:20:04.105950 4906 reconciler_common.go:293] "Volume detached for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/387c8b2c-8ffa-42e1-8818-37f9795384d1-service-telemetry-framework-index-dockercfg-user-build-volume\") on node \"crc\" DevicePath \"\""
Feb 21 00:20:04 crc kubenswrapper[4906]: I0221 00:20:04.105962 4906 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/387c8b2c-8ffa-42e1-8818-37f9795384d1-container-storage-root\") on node \"crc\" DevicePath \"\""
Feb 21 00:20:04 crc kubenswrapper[4906]: I0221 00:20:04.105974 4906 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/387c8b2c-8ffa-42e1-8818-37f9795384d1-build-blob-cache\") on node \"crc\" DevicePath \"\""
Feb 21 00:20:04 crc kubenswrapper[4906]: I0221 00:20:04.500215 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-framework-index-4-build_387c8b2c-8ffa-42e1-8818-37f9795384d1/git-clone/0.log"
Feb 21 00:20:04 crc kubenswrapper[4906]: I0221 00:20:04.500300 4906 generic.go:334] "Generic (PLEG): container finished" podID="387c8b2c-8ffa-42e1-8818-37f9795384d1" containerID="3082d7272a6fc206d6f70a83d64d97aa491686680b8d12c31d095cb0cbfdc992" exitCode=1
Feb 21 00:20:04 crc kubenswrapper[4906]: I0221 00:20:04.500345 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-4-build" event={"ID":"387c8b2c-8ffa-42e1-8818-37f9795384d1","Type":"ContainerDied","Data":"3082d7272a6fc206d6f70a83d64d97aa491686680b8d12c31d095cb0cbfdc992"}
Feb 21 00:20:04 crc kubenswrapper[4906]: I0221 00:20:04.500397 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-4-build" event={"ID":"387c8b2c-8ffa-42e1-8818-37f9795384d1","Type":"ContainerDied","Data":"b1cb35a5f14ea5014a11c196459005c6a58260ebfe9ae441d0553f2fed917947"}
Feb 21 00:20:04 crc kubenswrapper[4906]: I0221 00:20:04.500407 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-4-build"
Feb 21 00:20:04 crc kubenswrapper[4906]: I0221 00:20:04.500447 4906 scope.go:117] "RemoveContainer" containerID="3082d7272a6fc206d6f70a83d64d97aa491686680b8d12c31d095cb0cbfdc992"
Feb 21 00:20:04 crc kubenswrapper[4906]: I0221 00:20:04.518670 4906 scope.go:117] "RemoveContainer" containerID="3082d7272a6fc206d6f70a83d64d97aa491686680b8d12c31d095cb0cbfdc992"
Feb 21 00:20:04 crc kubenswrapper[4906]: E0221 00:20:04.519519 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3082d7272a6fc206d6f70a83d64d97aa491686680b8d12c31d095cb0cbfdc992\": container with ID starting with 3082d7272a6fc206d6f70a83d64d97aa491686680b8d12c31d095cb0cbfdc992 not found: ID does not exist" containerID="3082d7272a6fc206d6f70a83d64d97aa491686680b8d12c31d095cb0cbfdc992"
Feb 21 00:20:04 crc kubenswrapper[4906]: I0221 00:20:04.519617 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3082d7272a6fc206d6f70a83d64d97aa491686680b8d12c31d095cb0cbfdc992"} err="failed to get container status \"3082d7272a6fc206d6f70a83d64d97aa491686680b8d12c31d095cb0cbfdc992\": rpc error: code = NotFound desc = could not find container \"3082d7272a6fc206d6f70a83d64d97aa491686680b8d12c31d095cb0cbfdc992\": container with ID starting with 3082d7272a6fc206d6f70a83d64d97aa491686680b8d12c31d095cb0cbfdc992 not found: ID does not exist"
Feb 21 00:20:04 crc kubenswrapper[4906]: I0221 00:20:04.552400 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-framework-index-4-build"]
Feb 21 00:20:04 crc kubenswrapper[4906]: I0221 00:20:04.568122 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-framework-index-4-build"]
Feb 21 00:20:05 crc kubenswrapper[4906]: I0221 00:20:05.528028 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="387c8b2c-8ffa-42e1-8818-37f9795384d1" path="/var/lib/kubelet/pods/387c8b2c-8ffa-42e1-8818-37f9795384d1/volumes"
Feb 21 00:20:05 crc kubenswrapper[4906]: I0221 00:20:05.874198 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/infrawatch-operators-l49m7"]
Feb 21 00:20:05 crc kubenswrapper[4906]: E0221 00:20:05.874851 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="387c8b2c-8ffa-42e1-8818-37f9795384d1" containerName="git-clone"
Feb 21 00:20:05 crc kubenswrapper[4906]: I0221 00:20:05.874869 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="387c8b2c-8ffa-42e1-8818-37f9795384d1" containerName="git-clone"
Feb 21 00:20:05 crc kubenswrapper[4906]: I0221 00:20:05.875019 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="387c8b2c-8ffa-42e1-8818-37f9795384d1" containerName="git-clone"
Feb 21 00:20:05 crc kubenswrapper[4906]: I0221 00:20:05.875508 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-l49m7"
Feb 21 00:20:05 crc kubenswrapper[4906]: I0221 00:20:05.884200 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"infrawatch-operators-dockercfg-ttjc6"
Feb 21 00:20:05 crc kubenswrapper[4906]: I0221 00:20:05.895180 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-l49m7"]
Feb 21 00:20:06 crc kubenswrapper[4906]: I0221 00:20:06.029444 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s9w5\" (UniqueName: \"kubernetes.io/projected/6acb63f4-309c-49bc-b8e6-007db92e699a-kube-api-access-8s9w5\") pod \"infrawatch-operators-l49m7\" (UID: \"6acb63f4-309c-49bc-b8e6-007db92e699a\") " pod="service-telemetry/infrawatch-operators-l49m7"
Feb 21 00:20:06 crc kubenswrapper[4906]: I0221 00:20:06.130580 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8s9w5\" (UniqueName: \"kubernetes.io/projected/6acb63f4-309c-49bc-b8e6-007db92e699a-kube-api-access-8s9w5\") pod \"infrawatch-operators-l49m7\" (UID: \"6acb63f4-309c-49bc-b8e6-007db92e699a\") " pod="service-telemetry/infrawatch-operators-l49m7"
Feb 21 00:20:06 crc kubenswrapper[4906]: I0221 00:20:06.152731 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s9w5\" (UniqueName: \"kubernetes.io/projected/6acb63f4-309c-49bc-b8e6-007db92e699a-kube-api-access-8s9w5\") pod \"infrawatch-operators-l49m7\" (UID: \"6acb63f4-309c-49bc-b8e6-007db92e699a\") " pod="service-telemetry/infrawatch-operators-l49m7"
Feb 21 00:20:06 crc kubenswrapper[4906]: I0221 00:20:06.207372 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-l49m7"
Feb 21 00:20:06 crc kubenswrapper[4906]: I0221 00:20:06.499676 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-l49m7"]
Feb 21 00:20:06 crc kubenswrapper[4906]: I0221 00:20:06.516987 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-l49m7" event={"ID":"6acb63f4-309c-49bc-b8e6-007db92e699a","Type":"ContainerStarted","Data":"640e01cde1e0a2fa96402b8b4f77ef4726a3b65383db5cb1fea0823f4e698026"}
Feb 21 00:20:06 crc kubenswrapper[4906]: E0221 00:20:06.549492 4906 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest: reading manifest latest in image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index: manifest unknown" image="image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest"
Feb 21 00:20:06 crc kubenswrapper[4906]: E0221 00:20:06.549640 4906 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:registry-server,Image:image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:grpc,HostPort:0,ContainerPort:50051,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8s9w5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infrawatch-operators-l49m7_service-telemetry(6acb63f4-309c-49bc-b8e6-007db92e699a): ErrImagePull: initializing source docker://image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest: reading manifest latest in image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index: manifest unknown" logger="UnhandledError"
Feb 21 00:20:06 crc kubenswrapper[4906]: E0221 00:20:06.551154 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ErrImagePull: \"initializing source docker://image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest: reading manifest latest in image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index: manifest unknown\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a"
Feb 21 00:20:07 crc kubenswrapper[4906]: E0221 00:20:07.529251 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a"
Feb 21 00:20:13 crc kubenswrapper[4906]: I0221 00:20:13.124231 4906 patch_prober.go:28] interesting pod/machine-config-daemon-b9qdv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 21 00:20:13 crc kubenswrapper[4906]: I0221 00:20:13.124904 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 21 00:20:19 crc kubenswrapper[4906]: E0221 00:20:19.554593 4906 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest: reading manifest latest in image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index: manifest unknown" image="image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest"
Feb 21 00:20:19 crc kubenswrapper[4906]: E0221 00:20:19.555948 4906 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:registry-server,Image:image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:grpc,HostPort:0,ContainerPort:50051,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8s9w5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infrawatch-operators-l49m7_service-telemetry(6acb63f4-309c-49bc-b8e6-007db92e699a): ErrImagePull: initializing source docker://image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest: reading manifest latest in image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index: manifest unknown" logger="UnhandledError"
Feb 21 00:20:19 crc kubenswrapper[4906]: E0221 00:20:19.557175 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ErrImagePull: \"initializing source docker://image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest: reading manifest latest in image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index: manifest unknown\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a"
Feb 21 00:20:20 crc kubenswrapper[4906]: I0221 00:20:20.709139 4906 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 21 00:20:34 crc kubenswrapper[4906]: E0221 00:20:34.520258 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a"
Feb 21 00:20:43 crc kubenswrapper[4906]: I0221 00:20:43.124746 4906 patch_prober.go:28] interesting pod/machine-config-daemon-b9qdv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 21 00:20:43 crc kubenswrapper[4906]: I0221 00:20:43.125446 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 21 00:20:43 crc kubenswrapper[4906]: I0221 00:20:43.125504 4906 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv"
Feb 21 00:20:43 crc kubenswrapper[4906]: I0221 00:20:43.126200 4906 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5ba16906db7f4c001a4d75d4a61fcd36e0dcef9b8dae4fc267bc90aeebbab0e4"} pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 21 00:20:43 crc kubenswrapper[4906]: I0221 00:20:43.126272 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3" containerName="machine-config-daemon" containerID="cri-o://5ba16906db7f4c001a4d75d4a61fcd36e0dcef9b8dae4fc267bc90aeebbab0e4" gracePeriod=600
Feb 21 00:20:43 crc kubenswrapper[4906]: I0221 00:20:43.823572 4906 generic.go:334] "Generic (PLEG): container finished" podID="17518505-fa81-4399-b6cd-5527dae35ef3" containerID="5ba16906db7f4c001a4d75d4a61fcd36e0dcef9b8dae4fc267bc90aeebbab0e4" exitCode=0
Feb 21 00:20:43 crc kubenswrapper[4906]: I0221 00:20:43.823892 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" event={"ID":"17518505-fa81-4399-b6cd-5527dae35ef3","Type":"ContainerDied","Data":"5ba16906db7f4c001a4d75d4a61fcd36e0dcef9b8dae4fc267bc90aeebbab0e4"}
Feb 21 00:20:43 crc kubenswrapper[4906]: I0221 00:20:43.824580 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" event={"ID":"17518505-fa81-4399-b6cd-5527dae35ef3","Type":"ContainerStarted","Data":"d21d8b8565b08c95fd01c5784995c0e3e1e16856e6912945637d28380ef3cb46"}
Feb 21 00:20:43 crc kubenswrapper[4906]: I0221 00:20:43.824624 4906 scope.go:117] "RemoveContainer" containerID="6169b545a7a99c9b79ad6bca069941dcc1b133c237cbad0be59cce8e9e5cf28f"
Feb 21 00:20:45 crc kubenswrapper[4906]: E0221 00:20:45.567646 4906 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest: reading manifest latest in image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index: manifest unknown" image="image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest"
Feb 21 00:20:45 crc kubenswrapper[4906]: E0221 00:20:45.568366 4906 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:registry-server,Image:image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:grpc,HostPort:0,ContainerPort:50051,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8s9w5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infrawatch-operators-l49m7_service-telemetry(6acb63f4-309c-49bc-b8e6-007db92e699a): ErrImagePull: initializing source docker://image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest: reading manifest latest in image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index: manifest unknown" logger="UnhandledError"
Feb 21 00:20:45 crc kubenswrapper[4906]: E0221 00:20:45.569604 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ErrImagePull: \"initializing source docker://image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest: reading manifest latest in image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index: manifest unknown\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a"
Feb 21 00:20:56 crc kubenswrapper[4906]: E0221 00:20:56.519846 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a"
Feb 21 00:21:08 crc kubenswrapper[4906]: E0221 00:21:08.522956 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a"
Feb 21 00:21:22 crc kubenswrapper[4906]: E0221 00:21:22.518320 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a"
Feb 21 00:21:37 crc kubenswrapper[4906]: E0221 00:21:37.571554 4906 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest: reading manifest latest in image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index: manifest unknown" image="image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest"
Feb 21 00:21:37 crc kubenswrapper[4906]: E0221 00:21:37.572345 4906 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:registry-server,Image:image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:grpc,HostPort:0,ContainerPort:50051,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8s9w5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infrawatch-operators-l49m7_service-telemetry(6acb63f4-309c-49bc-b8e6-007db92e699a): ErrImagePull: initializing source docker://image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest: reading manifest latest in image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index: manifest unknown" logger="UnhandledError"
Feb 21 00:21:37 crc kubenswrapper[4906]: E0221 00:21:37.573673 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ErrImagePull: \"initializing source docker://image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest: reading manifest latest in image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index: manifest unknown\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a"
Feb 21 00:21:48 crc kubenswrapper[4906]: E0221 00:21:48.519870 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a"
Feb 21 00:22:02 crc kubenswrapper[4906]: E0221 00:22:02.520361 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a"
Feb 21 00:22:13 crc kubenswrapper[4906]: E0221 00:22:13.520789 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a"
Feb 21 00:22:25 crc kubenswrapper[4906]: E0221 00:22:25.521773 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a"
Feb 21 00:22:26 crc kubenswrapper[4906]: I0221 00:22:26.269611 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-snbht"]
Feb 21 00:22:26 crc kubenswrapper[4906]: I0221 00:22:26.271532 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-snbht"
Feb 21 00:22:26 crc kubenswrapper[4906]: I0221 00:22:26.288576 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-snbht"]
Feb 21 00:22:26 crc kubenswrapper[4906]: I0221 00:22:26.396358 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95qtc\" (UniqueName: \"kubernetes.io/projected/7845bc06-869f-4955-82e6-453b28d8141c-kube-api-access-95qtc\") pod \"redhat-operators-snbht\" (UID: \"7845bc06-869f-4955-82e6-453b28d8141c\") " pod="openshift-marketplace/redhat-operators-snbht"
Feb 21 00:22:26 crc kubenswrapper[4906]: I0221 00:22:26.396420 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7845bc06-869f-4955-82e6-453b28d8141c-utilities\") pod \"redhat-operators-snbht\" (UID: \"7845bc06-869f-4955-82e6-453b28d8141c\") " pod="openshift-marketplace/redhat-operators-snbht"
Feb 21 00:22:26 crc kubenswrapper[4906]: I0221 00:22:26.396460 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7845bc06-869f-4955-82e6-453b28d8141c-catalog-content\") pod \"redhat-operators-snbht\" (UID: \"7845bc06-869f-4955-82e6-453b28d8141c\") " pod="openshift-marketplace/redhat-operators-snbht"
Feb 21 00:22:26 crc kubenswrapper[4906]: I0221 00:22:26.498608 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95qtc\" (UniqueName: \"kubernetes.io/projected/7845bc06-869f-4955-82e6-453b28d8141c-kube-api-access-95qtc\") pod \"redhat-operators-snbht\" (UID: \"7845bc06-869f-4955-82e6-453b28d8141c\") " pod="openshift-marketplace/redhat-operators-snbht"
Feb 21 00:22:26 crc kubenswrapper[4906]: I0221 00:22:26.498673 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7845bc06-869f-4955-82e6-453b28d8141c-utilities\") pod \"redhat-operators-snbht\" (UID: \"7845bc06-869f-4955-82e6-453b28d8141c\") " pod="openshift-marketplace/redhat-operators-snbht"
Feb 21 00:22:26 crc kubenswrapper[4906]: I0221 00:22:26.498760 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7845bc06-869f-4955-82e6-453b28d8141c-catalog-content\") pod \"redhat-operators-snbht\" (UID: \"7845bc06-869f-4955-82e6-453b28d8141c\") " pod="openshift-marketplace/redhat-operators-snbht"
Feb 21 00:22:26 crc kubenswrapper[4906]: I0221 00:22:26.499259 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7845bc06-869f-4955-82e6-453b28d8141c-catalog-content\") pod \"redhat-operators-snbht\" (UID: \"7845bc06-869f-4955-82e6-453b28d8141c\") " pod="openshift-marketplace/redhat-operators-snbht"
Feb 21 00:22:26 crc kubenswrapper[4906]: I0221 00:22:26.499322 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7845bc06-869f-4955-82e6-453b28d8141c-utilities\") pod \"redhat-operators-snbht\" (UID: \"7845bc06-869f-4955-82e6-453b28d8141c\") " pod="openshift-marketplace/redhat-operators-snbht"
Feb 21 00:22:26 crc kubenswrapper[4906]: I0221 00:22:26.531560 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95qtc\" (UniqueName: \"kubernetes.io/projected/7845bc06-869f-4955-82e6-453b28d8141c-kube-api-access-95qtc\") pod \"redhat-operators-snbht\" (UID: \"7845bc06-869f-4955-82e6-453b28d8141c\") " pod="openshift-marketplace/redhat-operators-snbht"
Feb 21 00:22:26 crc kubenswrapper[4906]: I0221 00:22:26.596346 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-snbht"
Feb 21 00:22:26 crc kubenswrapper[4906]: I0221 00:22:26.832434 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-snbht"]
Feb 21 00:22:27 crc kubenswrapper[4906]: I0221 00:22:27.622012 4906 generic.go:334] "Generic (PLEG): container finished" podID="7845bc06-869f-4955-82e6-453b28d8141c" containerID="519ddf5a019b26dbd8ba81b82f750bce49c4d25d67ed44d83bb1f40ea49ad52f" exitCode=0
Feb 21 00:22:27 crc kubenswrapper[4906]: I0221 00:22:27.622166 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-snbht" event={"ID":"7845bc06-869f-4955-82e6-453b28d8141c","Type":"ContainerDied","Data":"519ddf5a019b26dbd8ba81b82f750bce49c4d25d67ed44d83bb1f40ea49ad52f"}
Feb 21 00:22:27 crc kubenswrapper[4906]: I0221 00:22:27.622449 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-snbht" event={"ID":"7845bc06-869f-4955-82e6-453b28d8141c","Type":"ContainerStarted","Data":"2d16d360adbb47a6cf0323d2e74585388a39e7152bd21342c86d0bcde8b1d554"}
Feb 21 00:22:28 crc kubenswrapper[4906]: I0221 00:22:28.632805 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-snbht" event={"ID":"7845bc06-869f-4955-82e6-453b28d8141c","Type":"ContainerStarted","Data":"184c90a0ce0f56eb996744fc47c199d6912f7ee00d2eec8ec158eec86f96fa2a"}
Feb 21 00:22:29 crc kubenswrapper[4906]: I0221 00:22:29.642760 4906 generic.go:334] "Generic (PLEG): container finished" podID="7845bc06-869f-4955-82e6-453b28d8141c" containerID="184c90a0ce0f56eb996744fc47c199d6912f7ee00d2eec8ec158eec86f96fa2a" exitCode=0
Feb 21 00:22:29 crc kubenswrapper[4906]: I0221 00:22:29.642812 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-snbht"
event={"ID":"7845bc06-869f-4955-82e6-453b28d8141c","Type":"ContainerDied","Data":"184c90a0ce0f56eb996744fc47c199d6912f7ee00d2eec8ec158eec86f96fa2a"} Feb 21 00:22:30 crc kubenswrapper[4906]: I0221 00:22:30.653100 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-snbht" event={"ID":"7845bc06-869f-4955-82e6-453b28d8141c","Type":"ContainerStarted","Data":"75bbe0dd016c059befbd3dc4540f90efd941bc3244e3802f6e90db6a15fe3f35"} Feb 21 00:22:30 crc kubenswrapper[4906]: I0221 00:22:30.680578 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-snbht" podStartSLOduration=2.269830422 podStartE2EDuration="4.6805525s" podCreationTimestamp="2026-02-21 00:22:26 +0000 UTC" firstStartedPulling="2026-02-21 00:22:27.623957095 +0000 UTC m=+882.875544621" lastFinishedPulling="2026-02-21 00:22:30.034679193 +0000 UTC m=+885.286266699" observedRunningTime="2026-02-21 00:22:30.678624175 +0000 UTC m=+885.930211681" watchObservedRunningTime="2026-02-21 00:22:30.6805525 +0000 UTC m=+885.932140036" Feb 21 00:22:36 crc kubenswrapper[4906]: I0221 00:22:36.596884 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-snbht" Feb 21 00:22:36 crc kubenswrapper[4906]: I0221 00:22:36.597162 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-snbht" Feb 21 00:22:37 crc kubenswrapper[4906]: I0221 00:22:37.632334 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-snbht" podUID="7845bc06-869f-4955-82e6-453b28d8141c" containerName="registry-server" probeResult="failure" output=< Feb 21 00:22:37 crc kubenswrapper[4906]: timeout: failed to connect service ":50051" within 1s Feb 21 00:22:37 crc kubenswrapper[4906]: > Feb 21 00:22:39 crc kubenswrapper[4906]: E0221 00:22:39.519057 4906 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:22:43 crc kubenswrapper[4906]: I0221 00:22:43.124466 4906 patch_prober.go:28] interesting pod/machine-config-daemon-b9qdv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 00:22:43 crc kubenswrapper[4906]: I0221 00:22:43.125016 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 00:22:46 crc kubenswrapper[4906]: I0221 00:22:46.669357 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-snbht" Feb 21 00:22:46 crc kubenswrapper[4906]: I0221 00:22:46.744282 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-snbht" Feb 21 00:22:46 crc kubenswrapper[4906]: I0221 00:22:46.914903 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-snbht"] Feb 21 00:22:47 crc kubenswrapper[4906]: I0221 00:22:47.788580 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-snbht" podUID="7845bc06-869f-4955-82e6-453b28d8141c" containerName="registry-server" containerID="cri-o://75bbe0dd016c059befbd3dc4540f90efd941bc3244e3802f6e90db6a15fe3f35" gracePeriod=2 Feb 21 
00:22:48 crc kubenswrapper[4906]: I0221 00:22:48.242154 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-snbht" Feb 21 00:22:48 crc kubenswrapper[4906]: I0221 00:22:48.361358 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95qtc\" (UniqueName: \"kubernetes.io/projected/7845bc06-869f-4955-82e6-453b28d8141c-kube-api-access-95qtc\") pod \"7845bc06-869f-4955-82e6-453b28d8141c\" (UID: \"7845bc06-869f-4955-82e6-453b28d8141c\") " Feb 21 00:22:48 crc kubenswrapper[4906]: I0221 00:22:48.361441 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7845bc06-869f-4955-82e6-453b28d8141c-catalog-content\") pod \"7845bc06-869f-4955-82e6-453b28d8141c\" (UID: \"7845bc06-869f-4955-82e6-453b28d8141c\") " Feb 21 00:22:48 crc kubenswrapper[4906]: I0221 00:22:48.361483 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7845bc06-869f-4955-82e6-453b28d8141c-utilities\") pod \"7845bc06-869f-4955-82e6-453b28d8141c\" (UID: \"7845bc06-869f-4955-82e6-453b28d8141c\") " Feb 21 00:22:48 crc kubenswrapper[4906]: I0221 00:22:48.363137 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7845bc06-869f-4955-82e6-453b28d8141c-utilities" (OuterVolumeSpecName: "utilities") pod "7845bc06-869f-4955-82e6-453b28d8141c" (UID: "7845bc06-869f-4955-82e6-453b28d8141c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:22:48 crc kubenswrapper[4906]: I0221 00:22:48.368272 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7845bc06-869f-4955-82e6-453b28d8141c-kube-api-access-95qtc" (OuterVolumeSpecName: "kube-api-access-95qtc") pod "7845bc06-869f-4955-82e6-453b28d8141c" (UID: "7845bc06-869f-4955-82e6-453b28d8141c"). InnerVolumeSpecName "kube-api-access-95qtc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:22:48 crc kubenswrapper[4906]: I0221 00:22:48.463615 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95qtc\" (UniqueName: \"kubernetes.io/projected/7845bc06-869f-4955-82e6-453b28d8141c-kube-api-access-95qtc\") on node \"crc\" DevicePath \"\"" Feb 21 00:22:48 crc kubenswrapper[4906]: I0221 00:22:48.463745 4906 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7845bc06-869f-4955-82e6-453b28d8141c-utilities\") on node \"crc\" DevicePath \"\"" Feb 21 00:22:48 crc kubenswrapper[4906]: I0221 00:22:48.503473 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7845bc06-869f-4955-82e6-453b28d8141c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7845bc06-869f-4955-82e6-453b28d8141c" (UID: "7845bc06-869f-4955-82e6-453b28d8141c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:22:48 crc kubenswrapper[4906]: I0221 00:22:48.566140 4906 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7845bc06-869f-4955-82e6-453b28d8141c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 21 00:22:48 crc kubenswrapper[4906]: I0221 00:22:48.799736 4906 generic.go:334] "Generic (PLEG): container finished" podID="7845bc06-869f-4955-82e6-453b28d8141c" containerID="75bbe0dd016c059befbd3dc4540f90efd941bc3244e3802f6e90db6a15fe3f35" exitCode=0 Feb 21 00:22:48 crc kubenswrapper[4906]: I0221 00:22:48.799807 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-snbht" event={"ID":"7845bc06-869f-4955-82e6-453b28d8141c","Type":"ContainerDied","Data":"75bbe0dd016c059befbd3dc4540f90efd941bc3244e3802f6e90db6a15fe3f35"} Feb 21 00:22:48 crc kubenswrapper[4906]: I0221 00:22:48.799816 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-snbht" Feb 21 00:22:48 crc kubenswrapper[4906]: I0221 00:22:48.799848 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-snbht" event={"ID":"7845bc06-869f-4955-82e6-453b28d8141c","Type":"ContainerDied","Data":"2d16d360adbb47a6cf0323d2e74585388a39e7152bd21342c86d0bcde8b1d554"} Feb 21 00:22:48 crc kubenswrapper[4906]: I0221 00:22:48.799877 4906 scope.go:117] "RemoveContainer" containerID="75bbe0dd016c059befbd3dc4540f90efd941bc3244e3802f6e90db6a15fe3f35" Feb 21 00:22:48 crc kubenswrapper[4906]: I0221 00:22:48.830418 4906 scope.go:117] "RemoveContainer" containerID="184c90a0ce0f56eb996744fc47c199d6912f7ee00d2eec8ec158eec86f96fa2a" Feb 21 00:22:48 crc kubenswrapper[4906]: I0221 00:22:48.844225 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-snbht"] Feb 21 00:22:48 crc kubenswrapper[4906]: I0221 00:22:48.849581 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-snbht"] Feb 21 00:22:48 crc kubenswrapper[4906]: I0221 00:22:48.873945 4906 scope.go:117] "RemoveContainer" containerID="519ddf5a019b26dbd8ba81b82f750bce49c4d25d67ed44d83bb1f40ea49ad52f" Feb 21 00:22:48 crc kubenswrapper[4906]: I0221 00:22:48.900188 4906 scope.go:117] "RemoveContainer" containerID="75bbe0dd016c059befbd3dc4540f90efd941bc3244e3802f6e90db6a15fe3f35" Feb 21 00:22:48 crc kubenswrapper[4906]: E0221 00:22:48.900781 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75bbe0dd016c059befbd3dc4540f90efd941bc3244e3802f6e90db6a15fe3f35\": container with ID starting with 75bbe0dd016c059befbd3dc4540f90efd941bc3244e3802f6e90db6a15fe3f35 not found: ID does not exist" containerID="75bbe0dd016c059befbd3dc4540f90efd941bc3244e3802f6e90db6a15fe3f35" Feb 21 00:22:48 crc kubenswrapper[4906]: I0221 00:22:48.900812 4906 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75bbe0dd016c059befbd3dc4540f90efd941bc3244e3802f6e90db6a15fe3f35"} err="failed to get container status \"75bbe0dd016c059befbd3dc4540f90efd941bc3244e3802f6e90db6a15fe3f35\": rpc error: code = NotFound desc = could not find container \"75bbe0dd016c059befbd3dc4540f90efd941bc3244e3802f6e90db6a15fe3f35\": container with ID starting with 75bbe0dd016c059befbd3dc4540f90efd941bc3244e3802f6e90db6a15fe3f35 not found: ID does not exist" Feb 21 00:22:48 crc kubenswrapper[4906]: I0221 00:22:48.900831 4906 scope.go:117] "RemoveContainer" containerID="184c90a0ce0f56eb996744fc47c199d6912f7ee00d2eec8ec158eec86f96fa2a" Feb 21 00:22:48 crc kubenswrapper[4906]: E0221 00:22:48.901458 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"184c90a0ce0f56eb996744fc47c199d6912f7ee00d2eec8ec158eec86f96fa2a\": container with ID starting with 184c90a0ce0f56eb996744fc47c199d6912f7ee00d2eec8ec158eec86f96fa2a not found: ID does not exist" containerID="184c90a0ce0f56eb996744fc47c199d6912f7ee00d2eec8ec158eec86f96fa2a" Feb 21 00:22:48 crc kubenswrapper[4906]: I0221 00:22:48.901591 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"184c90a0ce0f56eb996744fc47c199d6912f7ee00d2eec8ec158eec86f96fa2a"} err="failed to get container status \"184c90a0ce0f56eb996744fc47c199d6912f7ee00d2eec8ec158eec86f96fa2a\": rpc error: code = NotFound desc = could not find container \"184c90a0ce0f56eb996744fc47c199d6912f7ee00d2eec8ec158eec86f96fa2a\": container with ID starting with 184c90a0ce0f56eb996744fc47c199d6912f7ee00d2eec8ec158eec86f96fa2a not found: ID does not exist" Feb 21 00:22:48 crc kubenswrapper[4906]: I0221 00:22:48.901657 4906 scope.go:117] "RemoveContainer" containerID="519ddf5a019b26dbd8ba81b82f750bce49c4d25d67ed44d83bb1f40ea49ad52f" Feb 21 00:22:48 crc kubenswrapper[4906]: E0221 
00:22:48.902196 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"519ddf5a019b26dbd8ba81b82f750bce49c4d25d67ed44d83bb1f40ea49ad52f\": container with ID starting with 519ddf5a019b26dbd8ba81b82f750bce49c4d25d67ed44d83bb1f40ea49ad52f not found: ID does not exist" containerID="519ddf5a019b26dbd8ba81b82f750bce49c4d25d67ed44d83bb1f40ea49ad52f" Feb 21 00:22:48 crc kubenswrapper[4906]: I0221 00:22:48.902228 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"519ddf5a019b26dbd8ba81b82f750bce49c4d25d67ed44d83bb1f40ea49ad52f"} err="failed to get container status \"519ddf5a019b26dbd8ba81b82f750bce49c4d25d67ed44d83bb1f40ea49ad52f\": rpc error: code = NotFound desc = could not find container \"519ddf5a019b26dbd8ba81b82f750bce49c4d25d67ed44d83bb1f40ea49ad52f\": container with ID starting with 519ddf5a019b26dbd8ba81b82f750bce49c4d25d67ed44d83bb1f40ea49ad52f not found: ID does not exist" Feb 21 00:22:49 crc kubenswrapper[4906]: I0221 00:22:49.531089 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7845bc06-869f-4955-82e6-453b28d8141c" path="/var/lib/kubelet/pods/7845bc06-869f-4955-82e6-453b28d8141c/volumes" Feb 21 00:22:51 crc kubenswrapper[4906]: I0221 00:22:51.324730 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xkpld"] Feb 21 00:22:51 crc kubenswrapper[4906]: E0221 00:22:51.325269 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7845bc06-869f-4955-82e6-453b28d8141c" containerName="extract-utilities" Feb 21 00:22:51 crc kubenswrapper[4906]: I0221 00:22:51.325283 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="7845bc06-869f-4955-82e6-453b28d8141c" containerName="extract-utilities" Feb 21 00:22:51 crc kubenswrapper[4906]: E0221 00:22:51.325295 4906 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7845bc06-869f-4955-82e6-453b28d8141c" containerName="registry-server" Feb 21 00:22:51 crc kubenswrapper[4906]: I0221 00:22:51.325303 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="7845bc06-869f-4955-82e6-453b28d8141c" containerName="registry-server" Feb 21 00:22:51 crc kubenswrapper[4906]: E0221 00:22:51.325322 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7845bc06-869f-4955-82e6-453b28d8141c" containerName="extract-content" Feb 21 00:22:51 crc kubenswrapper[4906]: I0221 00:22:51.325332 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="7845bc06-869f-4955-82e6-453b28d8141c" containerName="extract-content" Feb 21 00:22:51 crc kubenswrapper[4906]: I0221 00:22:51.325455 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="7845bc06-869f-4955-82e6-453b28d8141c" containerName="registry-server" Feb 21 00:22:51 crc kubenswrapper[4906]: I0221 00:22:51.326396 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xkpld" Feb 21 00:22:51 crc kubenswrapper[4906]: I0221 00:22:51.347492 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xkpld"] Feb 21 00:22:51 crc kubenswrapper[4906]: I0221 00:22:51.408678 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5eb460f-49c0-483a-89c5-572d360c2850-catalog-content\") pod \"community-operators-xkpld\" (UID: \"d5eb460f-49c0-483a-89c5-572d360c2850\") " pod="openshift-marketplace/community-operators-xkpld" Feb 21 00:22:51 crc kubenswrapper[4906]: I0221 00:22:51.408752 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7k6b\" (UniqueName: \"kubernetes.io/projected/d5eb460f-49c0-483a-89c5-572d360c2850-kube-api-access-q7k6b\") pod \"community-operators-xkpld\" (UID: 
\"d5eb460f-49c0-483a-89c5-572d360c2850\") " pod="openshift-marketplace/community-operators-xkpld" Feb 21 00:22:51 crc kubenswrapper[4906]: I0221 00:22:51.408992 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5eb460f-49c0-483a-89c5-572d360c2850-utilities\") pod \"community-operators-xkpld\" (UID: \"d5eb460f-49c0-483a-89c5-572d360c2850\") " pod="openshift-marketplace/community-operators-xkpld" Feb 21 00:22:51 crc kubenswrapper[4906]: I0221 00:22:51.510311 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5eb460f-49c0-483a-89c5-572d360c2850-utilities\") pod \"community-operators-xkpld\" (UID: \"d5eb460f-49c0-483a-89c5-572d360c2850\") " pod="openshift-marketplace/community-operators-xkpld" Feb 21 00:22:51 crc kubenswrapper[4906]: I0221 00:22:51.510428 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5eb460f-49c0-483a-89c5-572d360c2850-catalog-content\") pod \"community-operators-xkpld\" (UID: \"d5eb460f-49c0-483a-89c5-572d360c2850\") " pod="openshift-marketplace/community-operators-xkpld" Feb 21 00:22:51 crc kubenswrapper[4906]: I0221 00:22:51.510454 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7k6b\" (UniqueName: \"kubernetes.io/projected/d5eb460f-49c0-483a-89c5-572d360c2850-kube-api-access-q7k6b\") pod \"community-operators-xkpld\" (UID: \"d5eb460f-49c0-483a-89c5-572d360c2850\") " pod="openshift-marketplace/community-operators-xkpld" Feb 21 00:22:51 crc kubenswrapper[4906]: I0221 00:22:51.510907 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5eb460f-49c0-483a-89c5-572d360c2850-utilities\") pod \"community-operators-xkpld\" (UID: 
\"d5eb460f-49c0-483a-89c5-572d360c2850\") " pod="openshift-marketplace/community-operators-xkpld" Feb 21 00:22:51 crc kubenswrapper[4906]: I0221 00:22:51.510992 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5eb460f-49c0-483a-89c5-572d360c2850-catalog-content\") pod \"community-operators-xkpld\" (UID: \"d5eb460f-49c0-483a-89c5-572d360c2850\") " pod="openshift-marketplace/community-operators-xkpld" Feb 21 00:22:51 crc kubenswrapper[4906]: I0221 00:22:51.538803 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7k6b\" (UniqueName: \"kubernetes.io/projected/d5eb460f-49c0-483a-89c5-572d360c2850-kube-api-access-q7k6b\") pod \"community-operators-xkpld\" (UID: \"d5eb460f-49c0-483a-89c5-572d360c2850\") " pod="openshift-marketplace/community-operators-xkpld" Feb 21 00:22:51 crc kubenswrapper[4906]: I0221 00:22:51.654121 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xkpld" Feb 21 00:22:51 crc kubenswrapper[4906]: I0221 00:22:51.971891 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xkpld"] Feb 21 00:22:52 crc kubenswrapper[4906]: E0221 00:22:52.518219 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:22:52 crc kubenswrapper[4906]: I0221 00:22:52.829169 4906 generic.go:334] "Generic (PLEG): container finished" podID="d5eb460f-49c0-483a-89c5-572d360c2850" containerID="a7c5b5a573dd566f44e08524db2c65bb9840be9c4ef46853f48dba3b3e07a073" exitCode=0 Feb 21 00:22:52 crc kubenswrapper[4906]: I0221 00:22:52.829213 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xkpld" event={"ID":"d5eb460f-49c0-483a-89c5-572d360c2850","Type":"ContainerDied","Data":"a7c5b5a573dd566f44e08524db2c65bb9840be9c4ef46853f48dba3b3e07a073"} Feb 21 00:22:52 crc kubenswrapper[4906]: I0221 00:22:52.829240 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xkpld" event={"ID":"d5eb460f-49c0-483a-89c5-572d360c2850","Type":"ContainerStarted","Data":"bc901cfb93c962f05dc3137cf4d3972d557e4e74d0f609b161f372fc03fac194"} Feb 21 00:22:52 crc kubenswrapper[4906]: I0221 00:22:52.831267 4906 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 21 00:22:53 crc kubenswrapper[4906]: I0221 00:22:53.836849 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xkpld" 
event={"ID":"d5eb460f-49c0-483a-89c5-572d360c2850","Type":"ContainerStarted","Data":"03703e3e2191fc154375e01544b4c9233304038c7ffbc4d7577f5a409b5efe44"} Feb 21 00:22:54 crc kubenswrapper[4906]: I0221 00:22:54.844800 4906 generic.go:334] "Generic (PLEG): container finished" podID="d5eb460f-49c0-483a-89c5-572d360c2850" containerID="03703e3e2191fc154375e01544b4c9233304038c7ffbc4d7577f5a409b5efe44" exitCode=0 Feb 21 00:22:54 crc kubenswrapper[4906]: I0221 00:22:54.844844 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xkpld" event={"ID":"d5eb460f-49c0-483a-89c5-572d360c2850","Type":"ContainerDied","Data":"03703e3e2191fc154375e01544b4c9233304038c7ffbc4d7577f5a409b5efe44"} Feb 21 00:22:55 crc kubenswrapper[4906]: I0221 00:22:55.863087 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xkpld" event={"ID":"d5eb460f-49c0-483a-89c5-572d360c2850","Type":"ContainerStarted","Data":"cab85143c2b381b2d6d2482ab7d0ce80a2efd13bd2f5852676b0c5947719679f"} Feb 21 00:22:55 crc kubenswrapper[4906]: I0221 00:22:55.882850 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xkpld" podStartSLOduration=2.47616811 podStartE2EDuration="4.882831081s" podCreationTimestamp="2026-02-21 00:22:51 +0000 UTC" firstStartedPulling="2026-02-21 00:22:52.830895355 +0000 UTC m=+908.082482871" lastFinishedPulling="2026-02-21 00:22:55.237558326 +0000 UTC m=+910.489145842" observedRunningTime="2026-02-21 00:22:55.881172197 +0000 UTC m=+911.132759703" watchObservedRunningTime="2026-02-21 00:22:55.882831081 +0000 UTC m=+911.134418587" Feb 21 00:23:01 crc kubenswrapper[4906]: I0221 00:23:01.655015 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xkpld" Feb 21 00:23:01 crc kubenswrapper[4906]: I0221 00:23:01.656398 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/community-operators-xkpld" Feb 21 00:23:01 crc kubenswrapper[4906]: I0221 00:23:01.691939 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xkpld" Feb 21 00:23:01 crc kubenswrapper[4906]: I0221 00:23:01.947586 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xkpld" Feb 21 00:23:01 crc kubenswrapper[4906]: I0221 00:23:01.998675 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xkpld"] Feb 21 00:23:03 crc kubenswrapper[4906]: E0221 00:23:03.555701 4906 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest: reading manifest latest in image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index: manifest unknown" image="image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest" Feb 21 00:23:03 crc kubenswrapper[4906]: E0221 00:23:03.556145 4906 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:registry-server,Image:image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:grpc,HostPort:0,ContainerPort:50051,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8s9w5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infrawatch-operators-l49m7_service-telemetry(6acb63f4-309c-49bc-b8e6-007db92e699a): ErrImagePull: initializing source docker://image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest: reading manifest latest in 
image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index: manifest unknown" logger="UnhandledError" Feb 21 00:23:03 crc kubenswrapper[4906]: E0221 00:23:03.557446 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ErrImagePull: \"initializing source docker://image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest: reading manifest latest in image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index: manifest unknown\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:23:03 crc kubenswrapper[4906]: I0221 00:23:03.917194 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xkpld" podUID="d5eb460f-49c0-483a-89c5-572d360c2850" containerName="registry-server" containerID="cri-o://cab85143c2b381b2d6d2482ab7d0ce80a2efd13bd2f5852676b0c5947719679f" gracePeriod=2 Feb 21 00:23:04 crc kubenswrapper[4906]: I0221 00:23:04.805156 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xkpld" Feb 21 00:23:04 crc kubenswrapper[4906]: I0221 00:23:04.890463 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5eb460f-49c0-483a-89c5-572d360c2850-catalog-content\") pod \"d5eb460f-49c0-483a-89c5-572d360c2850\" (UID: \"d5eb460f-49c0-483a-89c5-572d360c2850\") " Feb 21 00:23:04 crc kubenswrapper[4906]: I0221 00:23:04.890526 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5eb460f-49c0-483a-89c5-572d360c2850-utilities\") pod \"d5eb460f-49c0-483a-89c5-572d360c2850\" (UID: \"d5eb460f-49c0-483a-89c5-572d360c2850\") " Feb 21 00:23:04 crc kubenswrapper[4906]: I0221 00:23:04.890625 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7k6b\" (UniqueName: \"kubernetes.io/projected/d5eb460f-49c0-483a-89c5-572d360c2850-kube-api-access-q7k6b\") pod \"d5eb460f-49c0-483a-89c5-572d360c2850\" (UID: \"d5eb460f-49c0-483a-89c5-572d360c2850\") " Feb 21 00:23:04 crc kubenswrapper[4906]: I0221 00:23:04.891550 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5eb460f-49c0-483a-89c5-572d360c2850-utilities" (OuterVolumeSpecName: "utilities") pod "d5eb460f-49c0-483a-89c5-572d360c2850" (UID: "d5eb460f-49c0-483a-89c5-572d360c2850"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:23:04 crc kubenswrapper[4906]: I0221 00:23:04.903854 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5eb460f-49c0-483a-89c5-572d360c2850-kube-api-access-q7k6b" (OuterVolumeSpecName: "kube-api-access-q7k6b") pod "d5eb460f-49c0-483a-89c5-572d360c2850" (UID: "d5eb460f-49c0-483a-89c5-572d360c2850"). InnerVolumeSpecName "kube-api-access-q7k6b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:23:04 crc kubenswrapper[4906]: I0221 00:23:04.923027 4906 generic.go:334] "Generic (PLEG): container finished" podID="d5eb460f-49c0-483a-89c5-572d360c2850" containerID="cab85143c2b381b2d6d2482ab7d0ce80a2efd13bd2f5852676b0c5947719679f" exitCode=0 Feb 21 00:23:04 crc kubenswrapper[4906]: I0221 00:23:04.923066 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xkpld" event={"ID":"d5eb460f-49c0-483a-89c5-572d360c2850","Type":"ContainerDied","Data":"cab85143c2b381b2d6d2482ab7d0ce80a2efd13bd2f5852676b0c5947719679f"} Feb 21 00:23:04 crc kubenswrapper[4906]: I0221 00:23:04.923092 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xkpld" event={"ID":"d5eb460f-49c0-483a-89c5-572d360c2850","Type":"ContainerDied","Data":"bc901cfb93c962f05dc3137cf4d3972d557e4e74d0f609b161f372fc03fac194"} Feb 21 00:23:04 crc kubenswrapper[4906]: I0221 00:23:04.923108 4906 scope.go:117] "RemoveContainer" containerID="cab85143c2b381b2d6d2482ab7d0ce80a2efd13bd2f5852676b0c5947719679f" Feb 21 00:23:04 crc kubenswrapper[4906]: I0221 00:23:04.923393 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xkpld" Feb 21 00:23:04 crc kubenswrapper[4906]: I0221 00:23:04.939379 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5eb460f-49c0-483a-89c5-572d360c2850-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d5eb460f-49c0-483a-89c5-572d360c2850" (UID: "d5eb460f-49c0-483a-89c5-572d360c2850"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:23:04 crc kubenswrapper[4906]: I0221 00:23:04.940215 4906 scope.go:117] "RemoveContainer" containerID="03703e3e2191fc154375e01544b4c9233304038c7ffbc4d7577f5a409b5efe44" Feb 21 00:23:04 crc kubenswrapper[4906]: I0221 00:23:04.961738 4906 scope.go:117] "RemoveContainer" containerID="a7c5b5a573dd566f44e08524db2c65bb9840be9c4ef46853f48dba3b3e07a073" Feb 21 00:23:04 crc kubenswrapper[4906]: I0221 00:23:04.973845 4906 scope.go:117] "RemoveContainer" containerID="cab85143c2b381b2d6d2482ab7d0ce80a2efd13bd2f5852676b0c5947719679f" Feb 21 00:23:04 crc kubenswrapper[4906]: E0221 00:23:04.974239 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cab85143c2b381b2d6d2482ab7d0ce80a2efd13bd2f5852676b0c5947719679f\": container with ID starting with cab85143c2b381b2d6d2482ab7d0ce80a2efd13bd2f5852676b0c5947719679f not found: ID does not exist" containerID="cab85143c2b381b2d6d2482ab7d0ce80a2efd13bd2f5852676b0c5947719679f" Feb 21 00:23:04 crc kubenswrapper[4906]: I0221 00:23:04.974269 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cab85143c2b381b2d6d2482ab7d0ce80a2efd13bd2f5852676b0c5947719679f"} err="failed to get container status \"cab85143c2b381b2d6d2482ab7d0ce80a2efd13bd2f5852676b0c5947719679f\": rpc error: code = NotFound desc = could not find container \"cab85143c2b381b2d6d2482ab7d0ce80a2efd13bd2f5852676b0c5947719679f\": container with ID starting with cab85143c2b381b2d6d2482ab7d0ce80a2efd13bd2f5852676b0c5947719679f not found: ID does not exist" Feb 21 00:23:04 crc kubenswrapper[4906]: I0221 00:23:04.974289 4906 scope.go:117] "RemoveContainer" containerID="03703e3e2191fc154375e01544b4c9233304038c7ffbc4d7577f5a409b5efe44" Feb 21 00:23:04 crc kubenswrapper[4906]: E0221 00:23:04.974761 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"03703e3e2191fc154375e01544b4c9233304038c7ffbc4d7577f5a409b5efe44\": container with ID starting with 03703e3e2191fc154375e01544b4c9233304038c7ffbc4d7577f5a409b5efe44 not found: ID does not exist" containerID="03703e3e2191fc154375e01544b4c9233304038c7ffbc4d7577f5a409b5efe44" Feb 21 00:23:04 crc kubenswrapper[4906]: I0221 00:23:04.974852 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03703e3e2191fc154375e01544b4c9233304038c7ffbc4d7577f5a409b5efe44"} err="failed to get container status \"03703e3e2191fc154375e01544b4c9233304038c7ffbc4d7577f5a409b5efe44\": rpc error: code = NotFound desc = could not find container \"03703e3e2191fc154375e01544b4c9233304038c7ffbc4d7577f5a409b5efe44\": container with ID starting with 03703e3e2191fc154375e01544b4c9233304038c7ffbc4d7577f5a409b5efe44 not found: ID does not exist" Feb 21 00:23:04 crc kubenswrapper[4906]: I0221 00:23:04.974920 4906 scope.go:117] "RemoveContainer" containerID="a7c5b5a573dd566f44e08524db2c65bb9840be9c4ef46853f48dba3b3e07a073" Feb 21 00:23:04 crc kubenswrapper[4906]: E0221 00:23:04.975313 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7c5b5a573dd566f44e08524db2c65bb9840be9c4ef46853f48dba3b3e07a073\": container with ID starting with a7c5b5a573dd566f44e08524db2c65bb9840be9c4ef46853f48dba3b3e07a073 not found: ID does not exist" containerID="a7c5b5a573dd566f44e08524db2c65bb9840be9c4ef46853f48dba3b3e07a073" Feb 21 00:23:04 crc kubenswrapper[4906]: I0221 00:23:04.975344 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7c5b5a573dd566f44e08524db2c65bb9840be9c4ef46853f48dba3b3e07a073"} err="failed to get container status \"a7c5b5a573dd566f44e08524db2c65bb9840be9c4ef46853f48dba3b3e07a073\": rpc error: code = NotFound desc = could not find container \"a7c5b5a573dd566f44e08524db2c65bb9840be9c4ef46853f48dba3b3e07a073\": 
container with ID starting with a7c5b5a573dd566f44e08524db2c65bb9840be9c4ef46853f48dba3b3e07a073 not found: ID does not exist" Feb 21 00:23:04 crc kubenswrapper[4906]: I0221 00:23:04.992140 4906 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5eb460f-49c0-483a-89c5-572d360c2850-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 21 00:23:04 crc kubenswrapper[4906]: I0221 00:23:04.992165 4906 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5eb460f-49c0-483a-89c5-572d360c2850-utilities\") on node \"crc\" DevicePath \"\"" Feb 21 00:23:04 crc kubenswrapper[4906]: I0221 00:23:04.992175 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7k6b\" (UniqueName: \"kubernetes.io/projected/d5eb460f-49c0-483a-89c5-572d360c2850-kube-api-access-q7k6b\") on node \"crc\" DevicePath \"\"" Feb 21 00:23:05 crc kubenswrapper[4906]: I0221 00:23:05.261201 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xkpld"] Feb 21 00:23:05 crc kubenswrapper[4906]: I0221 00:23:05.267509 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xkpld"] Feb 21 00:23:05 crc kubenswrapper[4906]: I0221 00:23:05.526327 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5eb460f-49c0-483a-89c5-572d360c2850" path="/var/lib/kubelet/pods/d5eb460f-49c0-483a-89c5-572d360c2850/volumes" Feb 21 00:23:13 crc kubenswrapper[4906]: I0221 00:23:13.141669 4906 patch_prober.go:28] interesting pod/machine-config-daemon-b9qdv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 00:23:13 crc kubenswrapper[4906]: I0221 00:23:13.142445 4906 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 00:23:17 crc kubenswrapper[4906]: E0221 00:23:17.519879 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:23:28 crc kubenswrapper[4906]: E0221 00:23:28.520326 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:23:39 crc kubenswrapper[4906]: E0221 00:23:39.519762 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:23:43 crc kubenswrapper[4906]: I0221 00:23:43.124316 4906 patch_prober.go:28] interesting pod/machine-config-daemon-b9qdv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 00:23:43 crc 
kubenswrapper[4906]: I0221 00:23:43.124803 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 00:23:43 crc kubenswrapper[4906]: I0221 00:23:43.124886 4906 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" Feb 21 00:23:43 crc kubenswrapper[4906]: I0221 00:23:43.125567 4906 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d21d8b8565b08c95fd01c5784995c0e3e1e16856e6912945637d28380ef3cb46"} pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 21 00:23:43 crc kubenswrapper[4906]: I0221 00:23:43.125659 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3" containerName="machine-config-daemon" containerID="cri-o://d21d8b8565b08c95fd01c5784995c0e3e1e16856e6912945637d28380ef3cb46" gracePeriod=600 Feb 21 00:23:44 crc kubenswrapper[4906]: I0221 00:23:44.197792 4906 generic.go:334] "Generic (PLEG): container finished" podID="17518505-fa81-4399-b6cd-5527dae35ef3" containerID="d21d8b8565b08c95fd01c5784995c0e3e1e16856e6912945637d28380ef3cb46" exitCode=0 Feb 21 00:23:44 crc kubenswrapper[4906]: I0221 00:23:44.197883 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" event={"ID":"17518505-fa81-4399-b6cd-5527dae35ef3","Type":"ContainerDied","Data":"d21d8b8565b08c95fd01c5784995c0e3e1e16856e6912945637d28380ef3cb46"} 
Feb 21 00:23:44 crc kubenswrapper[4906]: I0221 00:23:44.199902 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" event={"ID":"17518505-fa81-4399-b6cd-5527dae35ef3","Type":"ContainerStarted","Data":"97162606c9a977e3b770ba532a0511b268ac7e061a56e5a52cd6de61603ba900"} Feb 21 00:23:44 crc kubenswrapper[4906]: I0221 00:23:44.199934 4906 scope.go:117] "RemoveContainer" containerID="5ba16906db7f4c001a4d75d4a61fcd36e0dcef9b8dae4fc267bc90aeebbab0e4" Feb 21 00:23:51 crc kubenswrapper[4906]: E0221 00:23:51.521267 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:24:05 crc kubenswrapper[4906]: E0221 00:24:05.523046 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:24:17 crc kubenswrapper[4906]: E0221 00:24:17.520436 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:24:30 crc kubenswrapper[4906]: E0221 00:24:30.520948 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:24:44 crc kubenswrapper[4906]: E0221 00:24:44.522216 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:24:58 crc kubenswrapper[4906]: E0221 00:24:58.520927 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:25:11 crc kubenswrapper[4906]: E0221 00:25:11.518826 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:25:25 crc kubenswrapper[4906]: E0221 00:25:25.527253 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" 
pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:25:33 crc kubenswrapper[4906]: I0221 00:25:33.232092 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/infrawatch-operators-7gxrh"] Feb 21 00:25:33 crc kubenswrapper[4906]: E0221 00:25:33.233497 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5eb460f-49c0-483a-89c5-572d360c2850" containerName="extract-content" Feb 21 00:25:33 crc kubenswrapper[4906]: I0221 00:25:33.233537 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5eb460f-49c0-483a-89c5-572d360c2850" containerName="extract-content" Feb 21 00:25:33 crc kubenswrapper[4906]: E0221 00:25:33.233569 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5eb460f-49c0-483a-89c5-572d360c2850" containerName="extract-utilities" Feb 21 00:25:33 crc kubenswrapper[4906]: I0221 00:25:33.233582 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5eb460f-49c0-483a-89c5-572d360c2850" containerName="extract-utilities" Feb 21 00:25:33 crc kubenswrapper[4906]: E0221 00:25:33.233633 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5eb460f-49c0-483a-89c5-572d360c2850" containerName="registry-server" Feb 21 00:25:33 crc kubenswrapper[4906]: I0221 00:25:33.233645 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5eb460f-49c0-483a-89c5-572d360c2850" containerName="registry-server" Feb 21 00:25:33 crc kubenswrapper[4906]: I0221 00:25:33.233867 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5eb460f-49c0-483a-89c5-572d360c2850" containerName="registry-server" Feb 21 00:25:33 crc kubenswrapper[4906]: I0221 00:25:33.234596 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-7gxrh" Feb 21 00:25:33 crc kubenswrapper[4906]: I0221 00:25:33.250779 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxhgc\" (UniqueName: \"kubernetes.io/projected/2447e8f2-533d-4a34-8379-fa94b6bd6d4f-kube-api-access-qxhgc\") pod \"infrawatch-operators-7gxrh\" (UID: \"2447e8f2-533d-4a34-8379-fa94b6bd6d4f\") " pod="service-telemetry/infrawatch-operators-7gxrh" Feb 21 00:25:33 crc kubenswrapper[4906]: I0221 00:25:33.251760 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-7gxrh"] Feb 21 00:25:33 crc kubenswrapper[4906]: I0221 00:25:33.352243 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxhgc\" (UniqueName: \"kubernetes.io/projected/2447e8f2-533d-4a34-8379-fa94b6bd6d4f-kube-api-access-qxhgc\") pod \"infrawatch-operators-7gxrh\" (UID: \"2447e8f2-533d-4a34-8379-fa94b6bd6d4f\") " pod="service-telemetry/infrawatch-operators-7gxrh" Feb 21 00:25:33 crc kubenswrapper[4906]: I0221 00:25:33.384023 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxhgc\" (UniqueName: \"kubernetes.io/projected/2447e8f2-533d-4a34-8379-fa94b6bd6d4f-kube-api-access-qxhgc\") pod \"infrawatch-operators-7gxrh\" (UID: \"2447e8f2-533d-4a34-8379-fa94b6bd6d4f\") " pod="service-telemetry/infrawatch-operators-7gxrh" Feb 21 00:25:33 crc kubenswrapper[4906]: I0221 00:25:33.563984 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-7gxrh" Feb 21 00:25:34 crc kubenswrapper[4906]: I0221 00:25:34.058416 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-7gxrh"] Feb 21 00:25:34 crc kubenswrapper[4906]: W0221 00:25:34.059545 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2447e8f2_533d_4a34_8379_fa94b6bd6d4f.slice/crio-3d12803aa2d31a221d6ef8e3d8ec493266ff47e7c4ab8727df7d9c7050cfd2cc WatchSource:0}: Error finding container 3d12803aa2d31a221d6ef8e3d8ec493266ff47e7c4ab8727df7d9c7050cfd2cc: Status 404 returned error can't find the container with id 3d12803aa2d31a221d6ef8e3d8ec493266ff47e7c4ab8727df7d9c7050cfd2cc Feb 21 00:25:34 crc kubenswrapper[4906]: E0221 00:25:34.100443 4906 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest: reading manifest latest in image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index: manifest unknown" image="image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest" Feb 21 00:25:34 crc kubenswrapper[4906]: E0221 00:25:34.100660 4906 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:registry-server,Image:image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:grpc,HostPort:0,ContainerPort:50051,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qxhgc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infrawatch-operators-7gxrh_service-telemetry(2447e8f2-533d-4a34-8379-fa94b6bd6d4f): ErrImagePull: initializing source docker://image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest: reading manifest latest in 
image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index: manifest unknown" logger="UnhandledError" Feb 21 00:25:34 crc kubenswrapper[4906]: E0221 00:25:34.102014 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ErrImagePull: \"initializing source docker://image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest: reading manifest latest in image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index: manifest unknown\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:25:35 crc kubenswrapper[4906]: I0221 00:25:35.053162 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-7gxrh" event={"ID":"2447e8f2-533d-4a34-8379-fa94b6bd6d4f","Type":"ContainerStarted","Data":"3d12803aa2d31a221d6ef8e3d8ec493266ff47e7c4ab8727df7d9c7050cfd2cc"} Feb 21 00:25:35 crc kubenswrapper[4906]: E0221 00:25:35.056347 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:25:36 crc kubenswrapper[4906]: E0221 00:25:36.064315 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:25:37 crc kubenswrapper[4906]: E0221 00:25:37.519678 4906 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:25:38 crc kubenswrapper[4906]: I0221 00:25:38.034889 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qznxp"] Feb 21 00:25:38 crc kubenswrapper[4906]: I0221 00:25:38.036299 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qznxp" Feb 21 00:25:38 crc kubenswrapper[4906]: I0221 00:25:38.058412 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qznxp"] Feb 21 00:25:38 crc kubenswrapper[4906]: I0221 00:25:38.170023 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cngnx\" (UniqueName: \"kubernetes.io/projected/154e7a12-6669-4fed-a335-85ecb8d3daaa-kube-api-access-cngnx\") pod \"certified-operators-qznxp\" (UID: \"154e7a12-6669-4fed-a335-85ecb8d3daaa\") " pod="openshift-marketplace/certified-operators-qznxp" Feb 21 00:25:38 crc kubenswrapper[4906]: I0221 00:25:38.170076 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/154e7a12-6669-4fed-a335-85ecb8d3daaa-catalog-content\") pod \"certified-operators-qznxp\" (UID: \"154e7a12-6669-4fed-a335-85ecb8d3daaa\") " pod="openshift-marketplace/certified-operators-qznxp" Feb 21 00:25:38 crc kubenswrapper[4906]: I0221 00:25:38.170108 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/154e7a12-6669-4fed-a335-85ecb8d3daaa-utilities\") pod \"certified-operators-qznxp\" (UID: \"154e7a12-6669-4fed-a335-85ecb8d3daaa\") " pod="openshift-marketplace/certified-operators-qznxp" Feb 21 00:25:38 crc kubenswrapper[4906]: I0221 00:25:38.271715 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/154e7a12-6669-4fed-a335-85ecb8d3daaa-catalog-content\") pod \"certified-operators-qznxp\" (UID: \"154e7a12-6669-4fed-a335-85ecb8d3daaa\") " pod="openshift-marketplace/certified-operators-qznxp" Feb 21 00:25:38 crc kubenswrapper[4906]: I0221 00:25:38.271840 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/154e7a12-6669-4fed-a335-85ecb8d3daaa-utilities\") pod \"certified-operators-qznxp\" (UID: \"154e7a12-6669-4fed-a335-85ecb8d3daaa\") " pod="openshift-marketplace/certified-operators-qznxp" Feb 21 00:25:38 crc kubenswrapper[4906]: I0221 00:25:38.271992 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cngnx\" (UniqueName: \"kubernetes.io/projected/154e7a12-6669-4fed-a335-85ecb8d3daaa-kube-api-access-cngnx\") pod \"certified-operators-qznxp\" (UID: \"154e7a12-6669-4fed-a335-85ecb8d3daaa\") " pod="openshift-marketplace/certified-operators-qznxp" Feb 21 00:25:38 crc kubenswrapper[4906]: I0221 00:25:38.272278 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/154e7a12-6669-4fed-a335-85ecb8d3daaa-catalog-content\") pod \"certified-operators-qznxp\" (UID: \"154e7a12-6669-4fed-a335-85ecb8d3daaa\") " pod="openshift-marketplace/certified-operators-qznxp" Feb 21 00:25:38 crc kubenswrapper[4906]: I0221 00:25:38.272467 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/154e7a12-6669-4fed-a335-85ecb8d3daaa-utilities\") pod \"certified-operators-qznxp\" (UID: \"154e7a12-6669-4fed-a335-85ecb8d3daaa\") " pod="openshift-marketplace/certified-operators-qznxp" Feb 21 00:25:38 crc kubenswrapper[4906]: I0221 00:25:38.301589 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cngnx\" (UniqueName: \"kubernetes.io/projected/154e7a12-6669-4fed-a335-85ecb8d3daaa-kube-api-access-cngnx\") pod \"certified-operators-qznxp\" (UID: \"154e7a12-6669-4fed-a335-85ecb8d3daaa\") " pod="openshift-marketplace/certified-operators-qznxp" Feb 21 00:25:38 crc kubenswrapper[4906]: I0221 00:25:38.369810 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qznxp" Feb 21 00:25:38 crc kubenswrapper[4906]: I0221 00:25:38.625322 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qznxp"] Feb 21 00:25:39 crc kubenswrapper[4906]: I0221 00:25:39.091263 4906 generic.go:334] "Generic (PLEG): container finished" podID="154e7a12-6669-4fed-a335-85ecb8d3daaa" containerID="104d69a3942b905235e601655b0ae614befa6d891702fe4023257041aeda8a41" exitCode=0 Feb 21 00:25:39 crc kubenswrapper[4906]: I0221 00:25:39.091327 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qznxp" event={"ID":"154e7a12-6669-4fed-a335-85ecb8d3daaa","Type":"ContainerDied","Data":"104d69a3942b905235e601655b0ae614befa6d891702fe4023257041aeda8a41"} Feb 21 00:25:39 crc kubenswrapper[4906]: I0221 00:25:39.091368 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qznxp" event={"ID":"154e7a12-6669-4fed-a335-85ecb8d3daaa","Type":"ContainerStarted","Data":"4475106c7da59f73520b931a0aa37c2a5c621d092d06a6429b7a8f797a74b0ed"} Feb 21 00:25:40 crc kubenswrapper[4906]: I0221 00:25:40.100050 4906 generic.go:334] "Generic (PLEG): container 
finished" podID="154e7a12-6669-4fed-a335-85ecb8d3daaa" containerID="93f11dd841580a1480c6ed73d768febce533b219916dcd469a36c3cedc605d26" exitCode=0 Feb 21 00:25:40 crc kubenswrapper[4906]: I0221 00:25:40.100106 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qznxp" event={"ID":"154e7a12-6669-4fed-a335-85ecb8d3daaa","Type":"ContainerDied","Data":"93f11dd841580a1480c6ed73d768febce533b219916dcd469a36c3cedc605d26"} Feb 21 00:25:41 crc kubenswrapper[4906]: I0221 00:25:41.110845 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qznxp" event={"ID":"154e7a12-6669-4fed-a335-85ecb8d3daaa","Type":"ContainerStarted","Data":"1024177a4851dfc0ac9b817a4d9427f8c4ece6657b90b04105a9345411cca164"} Feb 21 00:25:41 crc kubenswrapper[4906]: I0221 00:25:41.133250 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qznxp" podStartSLOduration=1.715801873 podStartE2EDuration="3.133224823s" podCreationTimestamp="2026-02-21 00:25:38 +0000 UTC" firstStartedPulling="2026-02-21 00:25:39.093665891 +0000 UTC m=+1074.345253437" lastFinishedPulling="2026-02-21 00:25:40.511088881 +0000 UTC m=+1075.762676387" observedRunningTime="2026-02-21 00:25:41.13136451 +0000 UTC m=+1076.382952026" watchObservedRunningTime="2026-02-21 00:25:41.133224823 +0000 UTC m=+1076.384812349" Feb 21 00:25:43 crc kubenswrapper[4906]: I0221 00:25:43.123670 4906 patch_prober.go:28] interesting pod/machine-config-daemon-b9qdv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 00:25:43 crc kubenswrapper[4906]: I0221 00:25:43.124116 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" 
podUID="17518505-fa81-4399-b6cd-5527dae35ef3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 00:25:46 crc kubenswrapper[4906]: E0221 00:25:46.567257 4906 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest: reading manifest latest in image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index: manifest unknown" image="image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest" Feb 21 00:25:46 crc kubenswrapper[4906]: E0221 00:25:46.568024 4906 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:registry-server,Image:image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:grpc,HostPort:0,ContainerPort:50051,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qxhgc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe 
-addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infrawatch-operators-7gxrh_service-telemetry(2447e8f2-533d-4a34-8379-fa94b6bd6d4f): ErrImagePull: initializing source docker://image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest: reading manifest latest in image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index: manifest unknown" logger="UnhandledError" Feb 21 00:25:46 crc kubenswrapper[4906]: E0221 00:25:46.569960 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ErrImagePull: \"initializing source docker://image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest: reading manifest latest in image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index: manifest unknown\"" pod="service-telemetry/infrawatch-operators-7gxrh" 
podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:25:48 crc kubenswrapper[4906]: I0221 00:25:48.370941 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qznxp" Feb 21 00:25:48 crc kubenswrapper[4906]: I0221 00:25:48.371278 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qznxp" Feb 21 00:25:48 crc kubenswrapper[4906]: I0221 00:25:48.439049 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qznxp" Feb 21 00:25:49 crc kubenswrapper[4906]: I0221 00:25:49.238370 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qznxp" Feb 21 00:25:50 crc kubenswrapper[4906]: I0221 00:25:50.417819 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qznxp"] Feb 21 00:25:50 crc kubenswrapper[4906]: E0221 00:25:50.556534 4906 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest: reading manifest latest in image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index: manifest unknown" image="image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest" Feb 21 00:25:50 crc kubenswrapper[4906]: E0221 00:25:50.556872 4906 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:registry-server,Image:image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:grpc,HostPort:0,ContainerPort:50051,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8s9w5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe 
-addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infrawatch-operators-l49m7_service-telemetry(6acb63f4-309c-49bc-b8e6-007db92e699a): ErrImagePull: initializing source docker://image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest: reading manifest latest in image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index: manifest unknown" logger="UnhandledError" Feb 21 00:25:50 crc kubenswrapper[4906]: E0221 00:25:50.558859 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ErrImagePull: \"initializing source docker://image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest: reading manifest latest in image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index: manifest unknown\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:25:51 crc kubenswrapper[4906]: I0221 00:25:51.184458 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qznxp" podUID="154e7a12-6669-4fed-a335-85ecb8d3daaa" containerName="registry-server" containerID="cri-o://1024177a4851dfc0ac9b817a4d9427f8c4ece6657b90b04105a9345411cca164" gracePeriod=2 Feb 21 00:25:51 crc kubenswrapper[4906]: I0221 00:25:51.581052 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qznxp" Feb 21 00:25:51 crc kubenswrapper[4906]: I0221 00:25:51.663493 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/154e7a12-6669-4fed-a335-85ecb8d3daaa-catalog-content\") pod \"154e7a12-6669-4fed-a335-85ecb8d3daaa\" (UID: \"154e7a12-6669-4fed-a335-85ecb8d3daaa\") " Feb 21 00:25:51 crc kubenswrapper[4906]: I0221 00:25:51.663578 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cngnx\" (UniqueName: \"kubernetes.io/projected/154e7a12-6669-4fed-a335-85ecb8d3daaa-kube-api-access-cngnx\") pod \"154e7a12-6669-4fed-a335-85ecb8d3daaa\" (UID: \"154e7a12-6669-4fed-a335-85ecb8d3daaa\") " Feb 21 00:25:51 crc kubenswrapper[4906]: I0221 00:25:51.663752 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/154e7a12-6669-4fed-a335-85ecb8d3daaa-utilities\") pod \"154e7a12-6669-4fed-a335-85ecb8d3daaa\" (UID: \"154e7a12-6669-4fed-a335-85ecb8d3daaa\") " Feb 21 00:25:51 crc kubenswrapper[4906]: I0221 00:25:51.664884 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/154e7a12-6669-4fed-a335-85ecb8d3daaa-utilities" (OuterVolumeSpecName: "utilities") pod "154e7a12-6669-4fed-a335-85ecb8d3daaa" (UID: "154e7a12-6669-4fed-a335-85ecb8d3daaa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:25:51 crc kubenswrapper[4906]: I0221 00:25:51.669841 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/154e7a12-6669-4fed-a335-85ecb8d3daaa-kube-api-access-cngnx" (OuterVolumeSpecName: "kube-api-access-cngnx") pod "154e7a12-6669-4fed-a335-85ecb8d3daaa" (UID: "154e7a12-6669-4fed-a335-85ecb8d3daaa"). InnerVolumeSpecName "kube-api-access-cngnx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:25:51 crc kubenswrapper[4906]: I0221 00:25:51.729792 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/154e7a12-6669-4fed-a335-85ecb8d3daaa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "154e7a12-6669-4fed-a335-85ecb8d3daaa" (UID: "154e7a12-6669-4fed-a335-85ecb8d3daaa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:25:51 crc kubenswrapper[4906]: I0221 00:25:51.765874 4906 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/154e7a12-6669-4fed-a335-85ecb8d3daaa-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 21 00:25:51 crc kubenswrapper[4906]: I0221 00:25:51.765923 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cngnx\" (UniqueName: \"kubernetes.io/projected/154e7a12-6669-4fed-a335-85ecb8d3daaa-kube-api-access-cngnx\") on node \"crc\" DevicePath \"\"" Feb 21 00:25:51 crc kubenswrapper[4906]: I0221 00:25:51.765934 4906 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/154e7a12-6669-4fed-a335-85ecb8d3daaa-utilities\") on node \"crc\" DevicePath \"\"" Feb 21 00:25:52 crc kubenswrapper[4906]: I0221 00:25:52.195781 4906 generic.go:334] "Generic (PLEG): container finished" podID="154e7a12-6669-4fed-a335-85ecb8d3daaa" containerID="1024177a4851dfc0ac9b817a4d9427f8c4ece6657b90b04105a9345411cca164" exitCode=0 Feb 21 00:25:52 crc kubenswrapper[4906]: I0221 00:25:52.195856 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qznxp" event={"ID":"154e7a12-6669-4fed-a335-85ecb8d3daaa","Type":"ContainerDied","Data":"1024177a4851dfc0ac9b817a4d9427f8c4ece6657b90b04105a9345411cca164"} Feb 21 00:25:52 crc kubenswrapper[4906]: I0221 00:25:52.195906 4906 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qznxp" Feb 21 00:25:52 crc kubenswrapper[4906]: I0221 00:25:52.195935 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qznxp" event={"ID":"154e7a12-6669-4fed-a335-85ecb8d3daaa","Type":"ContainerDied","Data":"4475106c7da59f73520b931a0aa37c2a5c621d092d06a6429b7a8f797a74b0ed"} Feb 21 00:25:52 crc kubenswrapper[4906]: I0221 00:25:52.195976 4906 scope.go:117] "RemoveContainer" containerID="1024177a4851dfc0ac9b817a4d9427f8c4ece6657b90b04105a9345411cca164" Feb 21 00:25:52 crc kubenswrapper[4906]: I0221 00:25:52.230248 4906 scope.go:117] "RemoveContainer" containerID="93f11dd841580a1480c6ed73d768febce533b219916dcd469a36c3cedc605d26" Feb 21 00:25:52 crc kubenswrapper[4906]: I0221 00:25:52.252075 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qznxp"] Feb 21 00:25:52 crc kubenswrapper[4906]: I0221 00:25:52.264709 4906 scope.go:117] "RemoveContainer" containerID="104d69a3942b905235e601655b0ae614befa6d891702fe4023257041aeda8a41" Feb 21 00:25:52 crc kubenswrapper[4906]: I0221 00:25:52.264910 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qznxp"] Feb 21 00:25:52 crc kubenswrapper[4906]: I0221 00:25:52.291086 4906 scope.go:117] "RemoveContainer" containerID="1024177a4851dfc0ac9b817a4d9427f8c4ece6657b90b04105a9345411cca164" Feb 21 00:25:52 crc kubenswrapper[4906]: E0221 00:25:52.291624 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1024177a4851dfc0ac9b817a4d9427f8c4ece6657b90b04105a9345411cca164\": container with ID starting with 1024177a4851dfc0ac9b817a4d9427f8c4ece6657b90b04105a9345411cca164 not found: ID does not exist" containerID="1024177a4851dfc0ac9b817a4d9427f8c4ece6657b90b04105a9345411cca164" Feb 21 00:25:52 crc kubenswrapper[4906]: I0221 00:25:52.291677 
4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1024177a4851dfc0ac9b817a4d9427f8c4ece6657b90b04105a9345411cca164"} err="failed to get container status \"1024177a4851dfc0ac9b817a4d9427f8c4ece6657b90b04105a9345411cca164\": rpc error: code = NotFound desc = could not find container \"1024177a4851dfc0ac9b817a4d9427f8c4ece6657b90b04105a9345411cca164\": container with ID starting with 1024177a4851dfc0ac9b817a4d9427f8c4ece6657b90b04105a9345411cca164 not found: ID does not exist" Feb 21 00:25:52 crc kubenswrapper[4906]: I0221 00:25:52.291730 4906 scope.go:117] "RemoveContainer" containerID="93f11dd841580a1480c6ed73d768febce533b219916dcd469a36c3cedc605d26" Feb 21 00:25:52 crc kubenswrapper[4906]: E0221 00:25:52.292195 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93f11dd841580a1480c6ed73d768febce533b219916dcd469a36c3cedc605d26\": container with ID starting with 93f11dd841580a1480c6ed73d768febce533b219916dcd469a36c3cedc605d26 not found: ID does not exist" containerID="93f11dd841580a1480c6ed73d768febce533b219916dcd469a36c3cedc605d26" Feb 21 00:25:52 crc kubenswrapper[4906]: I0221 00:25:52.292228 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93f11dd841580a1480c6ed73d768febce533b219916dcd469a36c3cedc605d26"} err="failed to get container status \"93f11dd841580a1480c6ed73d768febce533b219916dcd469a36c3cedc605d26\": rpc error: code = NotFound desc = could not find container \"93f11dd841580a1480c6ed73d768febce533b219916dcd469a36c3cedc605d26\": container with ID starting with 93f11dd841580a1480c6ed73d768febce533b219916dcd469a36c3cedc605d26 not found: ID does not exist" Feb 21 00:25:52 crc kubenswrapper[4906]: I0221 00:25:52.292249 4906 scope.go:117] "RemoveContainer" containerID="104d69a3942b905235e601655b0ae614befa6d891702fe4023257041aeda8a41" Feb 21 00:25:52 crc kubenswrapper[4906]: E0221 
00:25:52.292772 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"104d69a3942b905235e601655b0ae614befa6d891702fe4023257041aeda8a41\": container with ID starting with 104d69a3942b905235e601655b0ae614befa6d891702fe4023257041aeda8a41 not found: ID does not exist" containerID="104d69a3942b905235e601655b0ae614befa6d891702fe4023257041aeda8a41" Feb 21 00:25:52 crc kubenswrapper[4906]: I0221 00:25:52.292805 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"104d69a3942b905235e601655b0ae614befa6d891702fe4023257041aeda8a41"} err="failed to get container status \"104d69a3942b905235e601655b0ae614befa6d891702fe4023257041aeda8a41\": rpc error: code = NotFound desc = could not find container \"104d69a3942b905235e601655b0ae614befa6d891702fe4023257041aeda8a41\": container with ID starting with 104d69a3942b905235e601655b0ae614befa6d891702fe4023257041aeda8a41 not found: ID does not exist" Feb 21 00:25:53 crc kubenswrapper[4906]: I0221 00:25:53.532678 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="154e7a12-6669-4fed-a335-85ecb8d3daaa" path="/var/lib/kubelet/pods/154e7a12-6669-4fed-a335-85ecb8d3daaa/volumes" Feb 21 00:25:57 crc kubenswrapper[4906]: E0221 00:25:57.519571 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:26:04 crc kubenswrapper[4906]: E0221 00:26:04.518974 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image 
\\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:26:10 crc kubenswrapper[4906]: E0221 00:26:10.571063 4906 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest: reading manifest latest in image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index: manifest unknown" image="image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest" Feb 21 00:26:10 crc kubenswrapper[4906]: E0221 00:26:10.571839 4906 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:registry-server,Image:image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:grpc,HostPort:0,ContainerPort:50051,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qxhgc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe 
-addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infrawatch-operators-7gxrh_service-telemetry(2447e8f2-533d-4a34-8379-fa94b6bd6d4f): ErrImagePull: initializing source docker://image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest: reading manifest latest in image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index: manifest unknown" logger="UnhandledError" Feb 21 00:26:10 crc kubenswrapper[4906]: E0221 00:26:10.573153 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ErrImagePull: \"initializing source docker://image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest: reading manifest latest in image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index: manifest unknown\"" pod="service-telemetry/infrawatch-operators-7gxrh" 
podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:26:13 crc kubenswrapper[4906]: I0221 00:26:13.124636 4906 patch_prober.go:28] interesting pod/machine-config-daemon-b9qdv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 00:26:13 crc kubenswrapper[4906]: I0221 00:26:13.125105 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 00:26:19 crc kubenswrapper[4906]: E0221 00:26:19.519874 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:26:22 crc kubenswrapper[4906]: E0221 00:26:22.519838 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:26:31 crc kubenswrapper[4906]: E0221 00:26:31.519609 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image 
\\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:26:37 crc kubenswrapper[4906]: E0221 00:26:37.519557 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:26:43 crc kubenswrapper[4906]: I0221 00:26:43.124129 4906 patch_prober.go:28] interesting pod/machine-config-daemon-b9qdv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 00:26:43 crc kubenswrapper[4906]: I0221 00:26:43.124518 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 00:26:43 crc kubenswrapper[4906]: I0221 00:26:43.124583 4906 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" Feb 21 00:26:43 crc kubenswrapper[4906]: I0221 00:26:43.125324 4906 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"97162606c9a977e3b770ba532a0511b268ac7e061a56e5a52cd6de61603ba900"} pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" containerMessage="Container 
machine-config-daemon failed liveness probe, will be restarted" Feb 21 00:26:43 crc kubenswrapper[4906]: I0221 00:26:43.125419 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3" containerName="machine-config-daemon" containerID="cri-o://97162606c9a977e3b770ba532a0511b268ac7e061a56e5a52cd6de61603ba900" gracePeriod=600 Feb 21 00:26:43 crc kubenswrapper[4906]: I0221 00:26:43.568958 4906 generic.go:334] "Generic (PLEG): container finished" podID="17518505-fa81-4399-b6cd-5527dae35ef3" containerID="97162606c9a977e3b770ba532a0511b268ac7e061a56e5a52cd6de61603ba900" exitCode=0 Feb 21 00:26:43 crc kubenswrapper[4906]: I0221 00:26:43.569040 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" event={"ID":"17518505-fa81-4399-b6cd-5527dae35ef3","Type":"ContainerDied","Data":"97162606c9a977e3b770ba532a0511b268ac7e061a56e5a52cd6de61603ba900"} Feb 21 00:26:43 crc kubenswrapper[4906]: I0221 00:26:43.569243 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" event={"ID":"17518505-fa81-4399-b6cd-5527dae35ef3","Type":"ContainerStarted","Data":"8a28c43ee19ee8e1b60675af1a84b261d9c1c81ae56a39e6e1b8e0cd64dab482"} Feb 21 00:26:43 crc kubenswrapper[4906]: I0221 00:26:43.569268 4906 scope.go:117] "RemoveContainer" containerID="d21d8b8565b08c95fd01c5784995c0e3e1e16856e6912945637d28380ef3cb46" Feb 21 00:26:46 crc kubenswrapper[4906]: E0221 00:26:46.519738 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" 
podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:26:48 crc kubenswrapper[4906]: E0221 00:26:48.519740 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:27:01 crc kubenswrapper[4906]: E0221 00:27:01.520636 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:27:03 crc kubenswrapper[4906]: E0221 00:27:03.569403 4906 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest: reading manifest latest in image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index: manifest unknown" image="image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest" Feb 21 00:27:03 crc kubenswrapper[4906]: E0221 00:27:03.570023 4906 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:registry-server,Image:image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:grpc,HostPort:0,ContainerPort:50051,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 
10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qxhgc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infrawatch-operators-7gxrh_service-telemetry(2447e8f2-533d-4a34-8379-fa94b6bd6d4f): ErrImagePull: initializing source docker://image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest: reading manifest latest in 
image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index: manifest unknown" logger="UnhandledError" Feb 21 00:27:03 crc kubenswrapper[4906]: E0221 00:27:03.571580 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ErrImagePull: \"initializing source docker://image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest: reading manifest latest in image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index: manifest unknown\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:27:16 crc kubenswrapper[4906]: E0221 00:27:16.519981 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:27:16 crc kubenswrapper[4906]: E0221 00:27:16.520479 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:27:28 crc kubenswrapper[4906]: E0221 00:27:28.521657 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" 
pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:27:29 crc kubenswrapper[4906]: E0221 00:27:29.519461 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:27:40 crc kubenswrapper[4906]: E0221 00:27:40.519572 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:27:41 crc kubenswrapper[4906]: E0221 00:27:41.518985 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:27:51 crc kubenswrapper[4906]: E0221 00:27:51.519111 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:27:54 crc kubenswrapper[4906]: E0221 00:27:54.519173 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:28:05 crc kubenswrapper[4906]: E0221 00:28:05.525910 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:28:08 crc kubenswrapper[4906]: E0221 00:28:08.519296 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:28:20 crc kubenswrapper[4906]: E0221 00:28:20.519821 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:28:23 crc kubenswrapper[4906]: E0221 00:28:23.521986 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" 
pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:28:34 crc kubenswrapper[4906]: I0221 00:28:34.520632 4906 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 21 00:28:34 crc kubenswrapper[4906]: E0221 00:28:34.570128 4906 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest: reading manifest latest in image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index: manifest unknown" image="image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest" Feb 21 00:28:34 crc kubenswrapper[4906]: E0221 00:28:34.570454 4906 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:registry-server,Image:image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:grpc,HostPort:0,ContainerPort:50051,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qxhgc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe 
-addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infrawatch-operators-7gxrh_service-telemetry(2447e8f2-533d-4a34-8379-fa94b6bd6d4f): ErrImagePull: initializing source docker://image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest: reading manifest latest in image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index: manifest unknown" logger="UnhandledError" Feb 21 00:28:34 crc kubenswrapper[4906]: E0221 00:28:34.571750 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ErrImagePull: \"initializing source docker://image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest: reading manifest latest in image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index: manifest unknown\"" pod="service-telemetry/infrawatch-operators-7gxrh" 
podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:28:36 crc kubenswrapper[4906]: E0221 00:28:36.519517 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:28:43 crc kubenswrapper[4906]: I0221 00:28:43.124480 4906 patch_prober.go:28] interesting pod/machine-config-daemon-b9qdv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 00:28:43 crc kubenswrapper[4906]: I0221 00:28:43.124893 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 00:28:48 crc kubenswrapper[4906]: E0221 00:28:48.521259 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:28:51 crc kubenswrapper[4906]: E0221 00:28:51.519984 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image 
\\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:28:59 crc kubenswrapper[4906]: E0221 00:28:59.519713 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:29:05 crc kubenswrapper[4906]: E0221 00:29:05.524531 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:29:13 crc kubenswrapper[4906]: I0221 00:29:13.124157 4906 patch_prober.go:28] interesting pod/machine-config-daemon-b9qdv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 00:29:13 crc kubenswrapper[4906]: I0221 00:29:13.124904 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 00:29:14 crc kubenswrapper[4906]: E0221 00:29:14.518744 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:29:17 crc kubenswrapper[4906]: E0221 00:29:17.519513 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:29:28 crc kubenswrapper[4906]: E0221 00:29:28.520398 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:29:28 crc kubenswrapper[4906]: E0221 00:29:28.520575 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:29:41 crc kubenswrapper[4906]: E0221 00:29:41.520407 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" 
pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:29:41 crc kubenswrapper[4906]: E0221 00:29:41.520579 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:29:43 crc kubenswrapper[4906]: I0221 00:29:43.124784 4906 patch_prober.go:28] interesting pod/machine-config-daemon-b9qdv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 00:29:43 crc kubenswrapper[4906]: I0221 00:29:43.124882 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 00:29:43 crc kubenswrapper[4906]: I0221 00:29:43.124954 4906 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" Feb 21 00:29:43 crc kubenswrapper[4906]: I0221 00:29:43.125758 4906 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8a28c43ee19ee8e1b60675af1a84b261d9c1c81ae56a39e6e1b8e0cd64dab482"} pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 21 00:29:43 crc kubenswrapper[4906]: I0221 00:29:43.125891 4906 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3" containerName="machine-config-daemon" containerID="cri-o://8a28c43ee19ee8e1b60675af1a84b261d9c1c81ae56a39e6e1b8e0cd64dab482" gracePeriod=600 Feb 21 00:29:43 crc kubenswrapper[4906]: I0221 00:29:43.908463 4906 generic.go:334] "Generic (PLEG): container finished" podID="17518505-fa81-4399-b6cd-5527dae35ef3" containerID="8a28c43ee19ee8e1b60675af1a84b261d9c1c81ae56a39e6e1b8e0cd64dab482" exitCode=0 Feb 21 00:29:43 crc kubenswrapper[4906]: I0221 00:29:43.908545 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" event={"ID":"17518505-fa81-4399-b6cd-5527dae35ef3","Type":"ContainerDied","Data":"8a28c43ee19ee8e1b60675af1a84b261d9c1c81ae56a39e6e1b8e0cd64dab482"} Feb 21 00:29:43 crc kubenswrapper[4906]: I0221 00:29:43.908870 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" event={"ID":"17518505-fa81-4399-b6cd-5527dae35ef3","Type":"ContainerStarted","Data":"6fc979229554d0d6557319058e84b2ee43f562aac81387cb471faf3829817c16"} Feb 21 00:29:43 crc kubenswrapper[4906]: I0221 00:29:43.908905 4906 scope.go:117] "RemoveContainer" containerID="97162606c9a977e3b770ba532a0511b268ac7e061a56e5a52cd6de61603ba900" Feb 21 00:29:52 crc kubenswrapper[4906]: E0221 00:29:52.520108 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:29:54 crc kubenswrapper[4906]: E0221 00:29:54.519565 4906 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:30:00 crc kubenswrapper[4906]: I0221 00:30:00.183428 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29527230-sbcb9"] Feb 21 00:30:00 crc kubenswrapper[4906]: E0221 00:30:00.184113 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="154e7a12-6669-4fed-a335-85ecb8d3daaa" containerName="extract-content" Feb 21 00:30:00 crc kubenswrapper[4906]: I0221 00:30:00.184132 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="154e7a12-6669-4fed-a335-85ecb8d3daaa" containerName="extract-content" Feb 21 00:30:00 crc kubenswrapper[4906]: E0221 00:30:00.184152 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="154e7a12-6669-4fed-a335-85ecb8d3daaa" containerName="registry-server" Feb 21 00:30:00 crc kubenswrapper[4906]: I0221 00:30:00.184162 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="154e7a12-6669-4fed-a335-85ecb8d3daaa" containerName="registry-server" Feb 21 00:30:00 crc kubenswrapper[4906]: E0221 00:30:00.184194 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="154e7a12-6669-4fed-a335-85ecb8d3daaa" containerName="extract-utilities" Feb 21 00:30:00 crc kubenswrapper[4906]: I0221 00:30:00.184207 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="154e7a12-6669-4fed-a335-85ecb8d3daaa" containerName="extract-utilities" Feb 21 00:30:00 crc kubenswrapper[4906]: I0221 00:30:00.184352 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="154e7a12-6669-4fed-a335-85ecb8d3daaa" containerName="registry-server" Feb 21 00:30:00 crc kubenswrapper[4906]: I0221 00:30:00.185084 4906 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29527230-sbcb9" Feb 21 00:30:00 crc kubenswrapper[4906]: I0221 00:30:00.187585 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 21 00:30:00 crc kubenswrapper[4906]: I0221 00:30:00.187906 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 21 00:30:00 crc kubenswrapper[4906]: I0221 00:30:00.192362 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29527230-sbcb9"] Feb 21 00:30:00 crc kubenswrapper[4906]: I0221 00:30:00.266396 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5fdca039-40b6-4e27-ae85-199578bd286a-secret-volume\") pod \"collect-profiles-29527230-sbcb9\" (UID: \"5fdca039-40b6-4e27-ae85-199578bd286a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29527230-sbcb9" Feb 21 00:30:00 crc kubenswrapper[4906]: I0221 00:30:00.266500 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsq2w\" (UniqueName: \"kubernetes.io/projected/5fdca039-40b6-4e27-ae85-199578bd286a-kube-api-access-qsq2w\") pod \"collect-profiles-29527230-sbcb9\" (UID: \"5fdca039-40b6-4e27-ae85-199578bd286a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29527230-sbcb9" Feb 21 00:30:00 crc kubenswrapper[4906]: I0221 00:30:00.266572 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5fdca039-40b6-4e27-ae85-199578bd286a-config-volume\") pod \"collect-profiles-29527230-sbcb9\" (UID: \"5fdca039-40b6-4e27-ae85-199578bd286a\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29527230-sbcb9" Feb 21 00:30:00 crc kubenswrapper[4906]: I0221 00:30:00.367473 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5fdca039-40b6-4e27-ae85-199578bd286a-secret-volume\") pod \"collect-profiles-29527230-sbcb9\" (UID: \"5fdca039-40b6-4e27-ae85-199578bd286a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29527230-sbcb9" Feb 21 00:30:00 crc kubenswrapper[4906]: I0221 00:30:00.367577 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsq2w\" (UniqueName: \"kubernetes.io/projected/5fdca039-40b6-4e27-ae85-199578bd286a-kube-api-access-qsq2w\") pod \"collect-profiles-29527230-sbcb9\" (UID: \"5fdca039-40b6-4e27-ae85-199578bd286a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29527230-sbcb9" Feb 21 00:30:00 crc kubenswrapper[4906]: I0221 00:30:00.367635 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5fdca039-40b6-4e27-ae85-199578bd286a-config-volume\") pod \"collect-profiles-29527230-sbcb9\" (UID: \"5fdca039-40b6-4e27-ae85-199578bd286a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29527230-sbcb9" Feb 21 00:30:00 crc kubenswrapper[4906]: I0221 00:30:00.369606 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5fdca039-40b6-4e27-ae85-199578bd286a-config-volume\") pod \"collect-profiles-29527230-sbcb9\" (UID: \"5fdca039-40b6-4e27-ae85-199578bd286a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29527230-sbcb9" Feb 21 00:30:00 crc kubenswrapper[4906]: I0221 00:30:00.374726 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/5fdca039-40b6-4e27-ae85-199578bd286a-secret-volume\") pod \"collect-profiles-29527230-sbcb9\" (UID: \"5fdca039-40b6-4e27-ae85-199578bd286a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29527230-sbcb9" Feb 21 00:30:00 crc kubenswrapper[4906]: I0221 00:30:00.387000 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsq2w\" (UniqueName: \"kubernetes.io/projected/5fdca039-40b6-4e27-ae85-199578bd286a-kube-api-access-qsq2w\") pod \"collect-profiles-29527230-sbcb9\" (UID: \"5fdca039-40b6-4e27-ae85-199578bd286a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29527230-sbcb9" Feb 21 00:30:00 crc kubenswrapper[4906]: I0221 00:30:00.510127 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29527230-sbcb9" Feb 21 00:30:00 crc kubenswrapper[4906]: I0221 00:30:00.791946 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29527230-sbcb9"] Feb 21 00:30:01 crc kubenswrapper[4906]: I0221 00:30:01.054873 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29527230-sbcb9" event={"ID":"5fdca039-40b6-4e27-ae85-199578bd286a","Type":"ContainerStarted","Data":"9d9d4b523b54c032fe3a5d2b1848a206123d5eab0898f429aabf7904999d221c"} Feb 21 00:30:01 crc kubenswrapper[4906]: I0221 00:30:01.055370 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29527230-sbcb9" event={"ID":"5fdca039-40b6-4e27-ae85-199578bd286a","Type":"ContainerStarted","Data":"a1a1c34d39a2f014c0088e677955363bfd6d13cc2f30e5d5fca74d8e2e0f22b6"} Feb 21 00:30:01 crc kubenswrapper[4906]: I0221 00:30:01.077150 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29527230-sbcb9" 
podStartSLOduration=1.077120561 podStartE2EDuration="1.077120561s" podCreationTimestamp="2026-02-21 00:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 00:30:01.07214665 +0000 UTC m=+1336.323734186" watchObservedRunningTime="2026-02-21 00:30:01.077120561 +0000 UTC m=+1336.328708077" Feb 21 00:30:02 crc kubenswrapper[4906]: I0221 00:30:02.064563 4906 generic.go:334] "Generic (PLEG): container finished" podID="5fdca039-40b6-4e27-ae85-199578bd286a" containerID="9d9d4b523b54c032fe3a5d2b1848a206123d5eab0898f429aabf7904999d221c" exitCode=0 Feb 21 00:30:02 crc kubenswrapper[4906]: I0221 00:30:02.064611 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29527230-sbcb9" event={"ID":"5fdca039-40b6-4e27-ae85-199578bd286a","Type":"ContainerDied","Data":"9d9d4b523b54c032fe3a5d2b1848a206123d5eab0898f429aabf7904999d221c"} Feb 21 00:30:03 crc kubenswrapper[4906]: I0221 00:30:03.301838 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29527230-sbcb9" Feb 21 00:30:03 crc kubenswrapper[4906]: I0221 00:30:03.408484 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5fdca039-40b6-4e27-ae85-199578bd286a-secret-volume\") pod \"5fdca039-40b6-4e27-ae85-199578bd286a\" (UID: \"5fdca039-40b6-4e27-ae85-199578bd286a\") " Feb 21 00:30:03 crc kubenswrapper[4906]: I0221 00:30:03.408619 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsq2w\" (UniqueName: \"kubernetes.io/projected/5fdca039-40b6-4e27-ae85-199578bd286a-kube-api-access-qsq2w\") pod \"5fdca039-40b6-4e27-ae85-199578bd286a\" (UID: \"5fdca039-40b6-4e27-ae85-199578bd286a\") " Feb 21 00:30:03 crc kubenswrapper[4906]: I0221 00:30:03.408743 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5fdca039-40b6-4e27-ae85-199578bd286a-config-volume\") pod \"5fdca039-40b6-4e27-ae85-199578bd286a\" (UID: \"5fdca039-40b6-4e27-ae85-199578bd286a\") " Feb 21 00:30:03 crc kubenswrapper[4906]: I0221 00:30:03.409302 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fdca039-40b6-4e27-ae85-199578bd286a-config-volume" (OuterVolumeSpecName: "config-volume") pod "5fdca039-40b6-4e27-ae85-199578bd286a" (UID: "5fdca039-40b6-4e27-ae85-199578bd286a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 00:30:03 crc kubenswrapper[4906]: I0221 00:30:03.413532 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fdca039-40b6-4e27-ae85-199578bd286a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5fdca039-40b6-4e27-ae85-199578bd286a" (UID: "5fdca039-40b6-4e27-ae85-199578bd286a"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 00:30:03 crc kubenswrapper[4906]: I0221 00:30:03.413677 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fdca039-40b6-4e27-ae85-199578bd286a-kube-api-access-qsq2w" (OuterVolumeSpecName: "kube-api-access-qsq2w") pod "5fdca039-40b6-4e27-ae85-199578bd286a" (UID: "5fdca039-40b6-4e27-ae85-199578bd286a"). InnerVolumeSpecName "kube-api-access-qsq2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:30:03 crc kubenswrapper[4906]: I0221 00:30:03.511165 4906 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5fdca039-40b6-4e27-ae85-199578bd286a-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 21 00:30:03 crc kubenswrapper[4906]: I0221 00:30:03.511226 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsq2w\" (UniqueName: \"kubernetes.io/projected/5fdca039-40b6-4e27-ae85-199578bd286a-kube-api-access-qsq2w\") on node \"crc\" DevicePath \"\"" Feb 21 00:30:03 crc kubenswrapper[4906]: I0221 00:30:03.511255 4906 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5fdca039-40b6-4e27-ae85-199578bd286a-config-volume\") on node \"crc\" DevicePath \"\"" Feb 21 00:30:03 crc kubenswrapper[4906]: E0221 00:30:03.520567 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:30:04 crc kubenswrapper[4906]: I0221 00:30:04.079959 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29527230-sbcb9" 
event={"ID":"5fdca039-40b6-4e27-ae85-199578bd286a","Type":"ContainerDied","Data":"a1a1c34d39a2f014c0088e677955363bfd6d13cc2f30e5d5fca74d8e2e0f22b6"} Feb 21 00:30:04 crc kubenswrapper[4906]: I0221 00:30:04.080015 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1a1c34d39a2f014c0088e677955363bfd6d13cc2f30e5d5fca74d8e2e0f22b6" Feb 21 00:30:04 crc kubenswrapper[4906]: I0221 00:30:04.080162 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29527230-sbcb9" Feb 21 00:30:06 crc kubenswrapper[4906]: E0221 00:30:06.519544 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:30:14 crc kubenswrapper[4906]: E0221 00:30:14.519172 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:30:18 crc kubenswrapper[4906]: E0221 00:30:18.521062 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:30:26 crc kubenswrapper[4906]: E0221 00:30:26.520588 4906 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:30:33 crc kubenswrapper[4906]: E0221 00:30:33.521527 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:30:39 crc kubenswrapper[4906]: E0221 00:30:39.519510 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:30:46 crc kubenswrapper[4906]: E0221 00:30:46.522176 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:30:51 crc kubenswrapper[4906]: E0221 00:30:51.520371 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image 
\\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:30:59 crc kubenswrapper[4906]: E0221 00:30:59.574718 4906 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest: reading manifest latest in image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index: manifest unknown" image="image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest" Feb 21 00:30:59 crc kubenswrapper[4906]: E0221 00:30:59.575550 4906 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:registry-server,Image:image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:grpc,HostPort:0,ContainerPort:50051,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8s9w5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe 
-addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infrawatch-operators-l49m7_service-telemetry(6acb63f4-309c-49bc-b8e6-007db92e699a): ErrImagePull: initializing source docker://image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest: reading manifest latest in image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index: manifest unknown" logger="UnhandledError" Feb 21 00:30:59 crc kubenswrapper[4906]: E0221 00:30:59.576782 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ErrImagePull: \"initializing source docker://image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest: reading manifest latest in image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index: manifest unknown\"" pod="service-telemetry/infrawatch-operators-l49m7" 
podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:31:04 crc kubenswrapper[4906]: E0221 00:31:04.518308 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:31:13 crc kubenswrapper[4906]: E0221 00:31:13.521236 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:31:16 crc kubenswrapper[4906]: I0221 00:31:16.096195 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vt6l9/must-gather-z6gz2"] Feb 21 00:31:16 crc kubenswrapper[4906]: E0221 00:31:16.097409 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fdca039-40b6-4e27-ae85-199578bd286a" containerName="collect-profiles" Feb 21 00:31:16 crc kubenswrapper[4906]: I0221 00:31:16.097509 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fdca039-40b6-4e27-ae85-199578bd286a" containerName="collect-profiles" Feb 21 00:31:16 crc kubenswrapper[4906]: I0221 00:31:16.097772 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fdca039-40b6-4e27-ae85-199578bd286a" containerName="collect-profiles" Feb 21 00:31:16 crc kubenswrapper[4906]: I0221 00:31:16.098569 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vt6l9/must-gather-z6gz2" Feb 21 00:31:16 crc kubenswrapper[4906]: I0221 00:31:16.104113 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-vt6l9"/"kube-root-ca.crt" Feb 21 00:31:16 crc kubenswrapper[4906]: I0221 00:31:16.104315 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/695d6cf5-e3f6-44c4-b2e9-d84ea9e7c7f4-must-gather-output\") pod \"must-gather-z6gz2\" (UID: \"695d6cf5-e3f6-44c4-b2e9-d84ea9e7c7f4\") " pod="openshift-must-gather-vt6l9/must-gather-z6gz2" Feb 21 00:31:16 crc kubenswrapper[4906]: I0221 00:31:16.104507 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k52f4\" (UniqueName: \"kubernetes.io/projected/695d6cf5-e3f6-44c4-b2e9-d84ea9e7c7f4-kube-api-access-k52f4\") pod \"must-gather-z6gz2\" (UID: \"695d6cf5-e3f6-44c4-b2e9-d84ea9e7c7f4\") " pod="openshift-must-gather-vt6l9/must-gather-z6gz2" Feb 21 00:31:16 crc kubenswrapper[4906]: I0221 00:31:16.112132 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-vt6l9"/"openshift-service-ca.crt" Feb 21 00:31:16 crc kubenswrapper[4906]: I0221 00:31:16.112464 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-vt6l9"/"default-dockercfg-7jlgz" Feb 21 00:31:16 crc kubenswrapper[4906]: I0221 00:31:16.116723 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vt6l9/must-gather-z6gz2"] Feb 21 00:31:16 crc kubenswrapper[4906]: I0221 00:31:16.205457 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k52f4\" (UniqueName: \"kubernetes.io/projected/695d6cf5-e3f6-44c4-b2e9-d84ea9e7c7f4-kube-api-access-k52f4\") pod \"must-gather-z6gz2\" (UID: \"695d6cf5-e3f6-44c4-b2e9-d84ea9e7c7f4\") " 
pod="openshift-must-gather-vt6l9/must-gather-z6gz2" Feb 21 00:31:16 crc kubenswrapper[4906]: I0221 00:31:16.205594 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/695d6cf5-e3f6-44c4-b2e9-d84ea9e7c7f4-must-gather-output\") pod \"must-gather-z6gz2\" (UID: \"695d6cf5-e3f6-44c4-b2e9-d84ea9e7c7f4\") " pod="openshift-must-gather-vt6l9/must-gather-z6gz2" Feb 21 00:31:16 crc kubenswrapper[4906]: I0221 00:31:16.206424 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/695d6cf5-e3f6-44c4-b2e9-d84ea9e7c7f4-must-gather-output\") pod \"must-gather-z6gz2\" (UID: \"695d6cf5-e3f6-44c4-b2e9-d84ea9e7c7f4\") " pod="openshift-must-gather-vt6l9/must-gather-z6gz2" Feb 21 00:31:16 crc kubenswrapper[4906]: I0221 00:31:16.236212 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k52f4\" (UniqueName: \"kubernetes.io/projected/695d6cf5-e3f6-44c4-b2e9-d84ea9e7c7f4-kube-api-access-k52f4\") pod \"must-gather-z6gz2\" (UID: \"695d6cf5-e3f6-44c4-b2e9-d84ea9e7c7f4\") " pod="openshift-must-gather-vt6l9/must-gather-z6gz2" Feb 21 00:31:16 crc kubenswrapper[4906]: I0221 00:31:16.417334 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vt6l9/must-gather-z6gz2" Feb 21 00:31:16 crc kubenswrapper[4906]: I0221 00:31:16.613248 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vt6l9/must-gather-z6gz2"] Feb 21 00:31:17 crc kubenswrapper[4906]: E0221 00:31:17.554233 4906 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest: reading manifest latest in image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index: manifest unknown" image="image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest" Feb 21 00:31:17 crc kubenswrapper[4906]: E0221 00:31:17.554843 4906 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:registry-server,Image:image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:grpc,HostPort:0,ContainerPort:50051,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qxhgc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe 
-addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infrawatch-operators-7gxrh_service-telemetry(2447e8f2-533d-4a34-8379-fa94b6bd6d4f): ErrImagePull: initializing source docker://image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest: reading manifest latest in image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index: manifest unknown" logger="UnhandledError" Feb 21 00:31:17 crc kubenswrapper[4906]: E0221 00:31:17.556097 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ErrImagePull: \"initializing source docker://image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest: reading manifest latest in image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index: manifest unknown\"" pod="service-telemetry/infrawatch-operators-7gxrh" 
podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:31:17 crc kubenswrapper[4906]: I0221 00:31:17.615439 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vt6l9/must-gather-z6gz2" event={"ID":"695d6cf5-e3f6-44c4-b2e9-d84ea9e7c7f4","Type":"ContainerStarted","Data":"d0be8da9f9fa9dc8fe9f729bfbff45029a1390b0860020c01d9807055ac32d51"} Feb 21 00:31:25 crc kubenswrapper[4906]: I0221 00:31:25.673826 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vt6l9/must-gather-z6gz2" event={"ID":"695d6cf5-e3f6-44c4-b2e9-d84ea9e7c7f4","Type":"ContainerStarted","Data":"8ae66f587489ec3537177f64fdb5e5fead073d40821e79f142f0b8e371669c78"} Feb 21 00:31:25 crc kubenswrapper[4906]: I0221 00:31:25.674348 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vt6l9/must-gather-z6gz2" event={"ID":"695d6cf5-e3f6-44c4-b2e9-d84ea9e7c7f4","Type":"ContainerStarted","Data":"0fa7a53994398e9e2df0c724d17fa077b09bc6290a95e4fd359328aab27e2c05"} Feb 21 00:31:25 crc kubenswrapper[4906]: I0221 00:31:25.689457 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vt6l9/must-gather-z6gz2" podStartSLOduration=1.446451816 podStartE2EDuration="9.68943652s" podCreationTimestamp="2026-02-21 00:31:16 +0000 UTC" firstStartedPulling="2026-02-21 00:31:16.620368425 +0000 UTC m=+1411.871955931" lastFinishedPulling="2026-02-21 00:31:24.863353099 +0000 UTC m=+1420.114940635" observedRunningTime="2026-02-21 00:31:25.685341104 +0000 UTC m=+1420.936928610" watchObservedRunningTime="2026-02-21 00:31:25.68943652 +0000 UTC m=+1420.941024046" Feb 21 00:31:27 crc kubenswrapper[4906]: E0221 00:31:27.519662 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" 
pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:31:28 crc kubenswrapper[4906]: E0221 00:31:28.519656 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:31:40 crc kubenswrapper[4906]: E0221 00:31:40.519769 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:31:43 crc kubenswrapper[4906]: I0221 00:31:43.123771 4906 patch_prober.go:28] interesting pod/machine-config-daemon-b9qdv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 00:31:43 crc kubenswrapper[4906]: I0221 00:31:43.123832 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 00:31:43 crc kubenswrapper[4906]: E0221 00:31:43.518297 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image 
\\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:31:52 crc kubenswrapper[4906]: E0221 00:31:52.519134 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:31:54 crc kubenswrapper[4906]: E0221 00:31:54.518545 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:32:05 crc kubenswrapper[4906]: E0221 00:32:05.525101 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:32:07 crc kubenswrapper[4906]: E0221 00:32:07.519937 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:32:09 crc 
kubenswrapper[4906]: I0221 00:32:09.753334 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-fdrrq_b6ee1e80-1e94-4513-903d-0478a92070b4/control-plane-machine-set-operator/0.log" Feb 21 00:32:09 crc kubenswrapper[4906]: I0221 00:32:09.903436 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-bbxgp_5bc1e348-33c5-4bfa-984f-312b58bff4cd/kube-rbac-proxy/0.log" Feb 21 00:32:09 crc kubenswrapper[4906]: I0221 00:32:09.921635 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-bbxgp_5bc1e348-33c5-4bfa-984f-312b58bff4cd/machine-api-operator/0.log" Feb 21 00:32:13 crc kubenswrapper[4906]: I0221 00:32:13.123554 4906 patch_prober.go:28] interesting pod/machine-config-daemon-b9qdv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 00:32:13 crc kubenswrapper[4906]: I0221 00:32:13.123958 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 00:32:19 crc kubenswrapper[4906]: E0221 00:32:19.520036 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:32:22 crc kubenswrapper[4906]: I0221 
00:32:22.459296 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-sqwxp_7233cb65-44b5-4f88-9aa4-dd37bfdb0f2d/cert-manager-controller/0.log" Feb 21 00:32:22 crc kubenswrapper[4906]: E0221 00:32:22.519421 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:32:22 crc kubenswrapper[4906]: I0221 00:32:22.629851 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-jlzgl_6cc213c0-dc6e-4284-b3ef-4315fce21686/cert-manager-cainjector/0.log" Feb 21 00:32:22 crc kubenswrapper[4906]: I0221 00:32:22.675066 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-fvzcd_8c4cfe95-3d03-4fc5-a750-8e7580361fa2/cert-manager-webhook/0.log" Feb 21 00:32:31 crc kubenswrapper[4906]: E0221 00:32:31.518961 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:32:34 crc kubenswrapper[4906]: E0221 00:32:34.518270 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-7gxrh" 
podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:32:36 crc kubenswrapper[4906]: I0221 00:32:36.208188 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-7lwz7_8600dc47-ea8b-4033-b9cc-fbb62f54e36e/prometheus-operator/0.log" Feb 21 00:32:36 crc kubenswrapper[4906]: I0221 00:32:36.307231 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-86d9894984-2k8rf_b36dc052-4396-45b6-9169-1efee41b5e32/prometheus-operator-admission-webhook/0.log" Feb 21 00:32:36 crc kubenswrapper[4906]: I0221 00:32:36.381002 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-86d9894984-bcvsj_93269518-8ff5-4e82-8f60-1a3382b87720/prometheus-operator-admission-webhook/0.log" Feb 21 00:32:36 crc kubenswrapper[4906]: I0221 00:32:36.485402 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-khgkm_bc85b339-0a0f-4745-88fa-8b48af27f6be/operator/0.log" Feb 21 00:32:36 crc kubenswrapper[4906]: I0221 00:32:36.572961 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-xbkjz_34cdaea4-84d8-4753-be71-0b73d7e9d1ba/perses-operator/0.log" Feb 21 00:32:40 crc kubenswrapper[4906]: I0221 00:32:40.425440 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dffn4"] Feb 21 00:32:40 crc kubenswrapper[4906]: I0221 00:32:40.427106 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dffn4"
Feb 21 00:32:40 crc kubenswrapper[4906]: I0221 00:32:40.437819 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dffn4"]
Feb 21 00:32:40 crc kubenswrapper[4906]: I0221 00:32:40.579275 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5bqk\" (UniqueName: \"kubernetes.io/projected/81f09ea7-f543-4bac-821d-e8284c2685c1-kube-api-access-b5bqk\") pod \"redhat-operators-dffn4\" (UID: \"81f09ea7-f543-4bac-821d-e8284c2685c1\") " pod="openshift-marketplace/redhat-operators-dffn4"
Feb 21 00:32:40 crc kubenswrapper[4906]: I0221 00:32:40.579578 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81f09ea7-f543-4bac-821d-e8284c2685c1-catalog-content\") pod \"redhat-operators-dffn4\" (UID: \"81f09ea7-f543-4bac-821d-e8284c2685c1\") " pod="openshift-marketplace/redhat-operators-dffn4"
Feb 21 00:32:40 crc kubenswrapper[4906]: I0221 00:32:40.579845 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81f09ea7-f543-4bac-821d-e8284c2685c1-utilities\") pod \"redhat-operators-dffn4\" (UID: \"81f09ea7-f543-4bac-821d-e8284c2685c1\") " pod="openshift-marketplace/redhat-operators-dffn4"
Feb 21 00:32:40 crc kubenswrapper[4906]: I0221 00:32:40.680950 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81f09ea7-f543-4bac-821d-e8284c2685c1-utilities\") pod \"redhat-operators-dffn4\" (UID: \"81f09ea7-f543-4bac-821d-e8284c2685c1\") " pod="openshift-marketplace/redhat-operators-dffn4"
Feb 21 00:32:40 crc kubenswrapper[4906]: I0221 00:32:40.681004 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5bqk\" (UniqueName: \"kubernetes.io/projected/81f09ea7-f543-4bac-821d-e8284c2685c1-kube-api-access-b5bqk\") pod \"redhat-operators-dffn4\" (UID: \"81f09ea7-f543-4bac-821d-e8284c2685c1\") " pod="openshift-marketplace/redhat-operators-dffn4"
Feb 21 00:32:40 crc kubenswrapper[4906]: I0221 00:32:40.681025 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81f09ea7-f543-4bac-821d-e8284c2685c1-catalog-content\") pod \"redhat-operators-dffn4\" (UID: \"81f09ea7-f543-4bac-821d-e8284c2685c1\") " pod="openshift-marketplace/redhat-operators-dffn4"
Feb 21 00:32:40 crc kubenswrapper[4906]: I0221 00:32:40.681496 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81f09ea7-f543-4bac-821d-e8284c2685c1-utilities\") pod \"redhat-operators-dffn4\" (UID: \"81f09ea7-f543-4bac-821d-e8284c2685c1\") " pod="openshift-marketplace/redhat-operators-dffn4"
Feb 21 00:32:40 crc kubenswrapper[4906]: I0221 00:32:40.681535 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81f09ea7-f543-4bac-821d-e8284c2685c1-catalog-content\") pod \"redhat-operators-dffn4\" (UID: \"81f09ea7-f543-4bac-821d-e8284c2685c1\") " pod="openshift-marketplace/redhat-operators-dffn4"
Feb 21 00:32:40 crc kubenswrapper[4906]: I0221 00:32:40.706617 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5bqk\" (UniqueName: \"kubernetes.io/projected/81f09ea7-f543-4bac-821d-e8284c2685c1-kube-api-access-b5bqk\") pod \"redhat-operators-dffn4\" (UID: \"81f09ea7-f543-4bac-821d-e8284c2685c1\") " pod="openshift-marketplace/redhat-operators-dffn4"
Feb 21 00:32:40 crc kubenswrapper[4906]: I0221 00:32:40.795148 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dffn4"
Feb 21 00:32:41 crc kubenswrapper[4906]: I0221 00:32:41.011246 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dffn4"]
Feb 21 00:32:41 crc kubenswrapper[4906]: I0221 00:32:41.185191 4906 generic.go:334] "Generic (PLEG): container finished" podID="81f09ea7-f543-4bac-821d-e8284c2685c1" containerID="5d653188fc4bb56c2410d713a6b8439f448a1ce8c442f11e48634f897a8781dd" exitCode=0
Feb 21 00:32:41 crc kubenswrapper[4906]: I0221 00:32:41.185235 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dffn4" event={"ID":"81f09ea7-f543-4bac-821d-e8284c2685c1","Type":"ContainerDied","Data":"5d653188fc4bb56c2410d713a6b8439f448a1ce8c442f11e48634f897a8781dd"}
Feb 21 00:32:41 crc kubenswrapper[4906]: I0221 00:32:41.185260 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dffn4" event={"ID":"81f09ea7-f543-4bac-821d-e8284c2685c1","Type":"ContainerStarted","Data":"5b50945694a6efdffcef697db36a8e0b37fd156741a979dd5a60226581d87ec6"}
Feb 21 00:32:42 crc kubenswrapper[4906]: I0221 00:32:42.192791 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dffn4" event={"ID":"81f09ea7-f543-4bac-821d-e8284c2685c1","Type":"ContainerStarted","Data":"7b13c5df677ab755f708ec08318f58901f7bcabaf9a8b6e9954a2fb6dce5c122"}
Feb 21 00:32:43 crc kubenswrapper[4906]: I0221 00:32:43.123788 4906 patch_prober.go:28] interesting pod/machine-config-daemon-b9qdv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 21 00:32:43 crc kubenswrapper[4906]: I0221 00:32:43.124077 4906 prober.go:107] "Probe failed" probeType="Liveness"
pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 21 00:32:43 crc kubenswrapper[4906]: I0221 00:32:43.124205 4906 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv"
Feb 21 00:32:43 crc kubenswrapper[4906]: I0221 00:32:43.124969 4906 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6fc979229554d0d6557319058e84b2ee43f562aac81387cb471faf3829817c16"} pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 21 00:32:43 crc kubenswrapper[4906]: I0221 00:32:43.125148 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3" containerName="machine-config-daemon" containerID="cri-o://6fc979229554d0d6557319058e84b2ee43f562aac81387cb471faf3829817c16" gracePeriod=600
Feb 21 00:32:43 crc kubenswrapper[4906]: I0221 00:32:43.206376 4906 generic.go:334] "Generic (PLEG): container finished" podID="81f09ea7-f543-4bac-821d-e8284c2685c1" containerID="7b13c5df677ab755f708ec08318f58901f7bcabaf9a8b6e9954a2fb6dce5c122" exitCode=0
Feb 21 00:32:43 crc kubenswrapper[4906]: I0221 00:32:43.206438 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dffn4" event={"ID":"81f09ea7-f543-4bac-821d-e8284c2685c1","Type":"ContainerDied","Data":"7b13c5df677ab755f708ec08318f58901f7bcabaf9a8b6e9954a2fb6dce5c122"}
Feb 21 00:32:43 crc kubenswrapper[4906]: E0221 00:32:43.259442 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9qdv_openshift-machine-config-operator(17518505-fa81-4399-b6cd-5527dae35ef3)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3"
Feb 21 00:32:44 crc kubenswrapper[4906]: I0221 00:32:44.215021 4906 generic.go:334] "Generic (PLEG): container finished" podID="17518505-fa81-4399-b6cd-5527dae35ef3" containerID="6fc979229554d0d6557319058e84b2ee43f562aac81387cb471faf3829817c16" exitCode=0
Feb 21 00:32:44 crc kubenswrapper[4906]: I0221 00:32:44.215112 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" event={"ID":"17518505-fa81-4399-b6cd-5527dae35ef3","Type":"ContainerDied","Data":"6fc979229554d0d6557319058e84b2ee43f562aac81387cb471faf3829817c16"}
Feb 21 00:32:44 crc kubenswrapper[4906]: I0221 00:32:44.215461 4906 scope.go:117] "RemoveContainer" containerID="8a28c43ee19ee8e1b60675af1a84b261d9c1c81ae56a39e6e1b8e0cd64dab482"
Feb 21 00:32:44 crc kubenswrapper[4906]: I0221 00:32:44.216035 4906 scope.go:117] "RemoveContainer" containerID="6fc979229554d0d6557319058e84b2ee43f562aac81387cb471faf3829817c16"
Feb 21 00:32:44 crc kubenswrapper[4906]: E0221 00:32:44.216320 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9qdv_openshift-machine-config-operator(17518505-fa81-4399-b6cd-5527dae35ef3)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3"
Feb 21 00:32:44 crc kubenswrapper[4906]: I0221 00:32:44.218011 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dffn4" event={"ID":"81f09ea7-f543-4bac-821d-e8284c2685c1","Type":"ContainerStarted","Data":"d5cae995f943def250cb2050125ef38847c57a20bcedfcc67a3031f043e66f40"}
Feb 21 00:32:44 crc kubenswrapper[4906]: I0221 00:32:44.256270 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dffn4" podStartSLOduration=1.817072069 podStartE2EDuration="4.256246684s" podCreationTimestamp="2026-02-21 00:32:40 +0000 UTC" firstStartedPulling="2026-02-21 00:32:41.188121796 +0000 UTC m=+1496.439709302" lastFinishedPulling="2026-02-21 00:32:43.627296411 +0000 UTC m=+1498.878883917" observedRunningTime="2026-02-21 00:32:44.252380344 +0000 UTC m=+1499.503967860" watchObservedRunningTime="2026-02-21 00:32:44.256246684 +0000 UTC m=+1499.507834190"
Feb 21 00:32:45 crc kubenswrapper[4906]: E0221 00:32:45.521843 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a"
Feb 21 00:32:48 crc kubenswrapper[4906]: E0221 00:32:48.519143 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f"
Feb 21 00:32:50 crc kubenswrapper[4906]: I0221 00:32:50.796670 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dffn4"
Feb 21 00:32:50 crc kubenswrapper[4906]: I0221 00:32:50.797098 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status=""
pod="openshift-marketplace/redhat-operators-dffn4"
Feb 21 00:32:50 crc kubenswrapper[4906]: I0221 00:32:50.839937 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dffn4"
Feb 21 00:32:51 crc kubenswrapper[4906]: I0221 00:32:51.304867 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dffn4"
Feb 21 00:32:51 crc kubenswrapper[4906]: I0221 00:32:51.349293 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dffn4"]
Feb 21 00:32:51 crc kubenswrapper[4906]: I0221 00:32:51.823503 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1xsfrg_94d58122-e78f-4369-937c-ba79bf5eec59/util/0.log"
Feb 21 00:32:51 crc kubenswrapper[4906]: I0221 00:32:51.998805 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1xsfrg_94d58122-e78f-4369-937c-ba79bf5eec59/pull/0.log"
Feb 21 00:32:52 crc kubenswrapper[4906]: I0221 00:32:52.028950 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1xsfrg_94d58122-e78f-4369-937c-ba79bf5eec59/util/0.log"
Feb 21 00:32:52 crc kubenswrapper[4906]: I0221 00:32:52.052217 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1xsfrg_94d58122-e78f-4369-937c-ba79bf5eec59/pull/0.log"
Feb 21 00:32:52 crc kubenswrapper[4906]: I0221 00:32:52.197528 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1xsfrg_94d58122-e78f-4369-937c-ba79bf5eec59/util/0.log"
Feb 21 00:32:52 crc kubenswrapper[4906]: I0221 00:32:52.217211 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1xsfrg_94d58122-e78f-4369-937c-ba79bf5eec59/pull/0.log"
Feb 21 00:32:52 crc kubenswrapper[4906]: I0221 00:32:52.228646 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1xsfrg_94d58122-e78f-4369-937c-ba79bf5eec59/extract/0.log"
Feb 21 00:32:52 crc kubenswrapper[4906]: I0221 00:32:52.404860 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56bzg5_a91476fa-3559-46e8-a2de-14d0d36d2cad/util/0.log"
Feb 21 00:32:52 crc kubenswrapper[4906]: I0221 00:32:52.641489 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56bzg5_a91476fa-3559-46e8-a2de-14d0d36d2cad/util/0.log"
Feb 21 00:32:52 crc kubenswrapper[4906]: I0221 00:32:52.657732 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56bzg5_a91476fa-3559-46e8-a2de-14d0d36d2cad/pull/0.log"
Feb 21 00:32:52 crc kubenswrapper[4906]: I0221 00:32:52.666739 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56bzg5_a91476fa-3559-46e8-a2de-14d0d36d2cad/pull/0.log"
Feb 21 00:32:52 crc kubenswrapper[4906]: I0221 00:32:52.829254 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56bzg5_a91476fa-3559-46e8-a2de-14d0d36d2cad/util/0.log"
Feb 21 00:32:52 crc kubenswrapper[4906]: I0221 00:32:52.834334 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56bzg5_a91476fa-3559-46e8-a2de-14d0d36d2cad/extract/0.log"
Feb 21 00:32:52 crc kubenswrapper[4906]: I0221 00:32:52.875320 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e56bzg5_a91476fa-3559-46e8-a2de-14d0d36d2cad/pull/0.log"
Feb 21 00:32:53 crc kubenswrapper[4906]: I0221 00:32:53.026025 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bzfxb_d31a01da-9278-4a4c-a880-60bbacec4d63/util/0.log"
Feb 21 00:32:53 crc kubenswrapper[4906]: I0221 00:32:53.161799 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bzfxb_d31a01da-9278-4a4c-a880-60bbacec4d63/pull/0.log"
Feb 21 00:32:53 crc kubenswrapper[4906]: I0221 00:32:53.198247 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bzfxb_d31a01da-9278-4a4c-a880-60bbacec4d63/pull/0.log"
Feb 21 00:32:53 crc kubenswrapper[4906]: I0221 00:32:53.216072 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bzfxb_d31a01da-9278-4a4c-a880-60bbacec4d63/util/0.log"
Feb 21 00:32:53 crc kubenswrapper[4906]: I0221 00:32:53.274038 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dffn4" podUID="81f09ea7-f543-4bac-821d-e8284c2685c1" containerName="registry-server" containerID="cri-o://d5cae995f943def250cb2050125ef38847c57a20bcedfcc67a3031f043e66f40" gracePeriod=2
Feb 21 00:32:53 crc kubenswrapper[4906]: I0221 00:32:53.364118 4906 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bzfxb_d31a01da-9278-4a4c-a880-60bbacec4d63/util/0.log"
Feb 21 00:32:53 crc kubenswrapper[4906]: I0221 00:32:53.411949 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bzfxb_d31a01da-9278-4a4c-a880-60bbacec4d63/extract/0.log"
Feb 21 00:32:53 crc kubenswrapper[4906]: I0221 00:32:53.441249 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08bzfxb_d31a01da-9278-4a4c-a880-60bbacec4d63/pull/0.log"
Feb 21 00:32:53 crc kubenswrapper[4906]: I0221 00:32:53.588321 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-w7db6_19b5471c-7b90-4839-91c4-31e5f98c3959/extract-utilities/0.log"
Feb 21 00:32:53 crc kubenswrapper[4906]: I0221 00:32:53.726896 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-w7db6_19b5471c-7b90-4839-91c4-31e5f98c3959/extract-content/0.log"
Feb 21 00:32:53 crc kubenswrapper[4906]: I0221 00:32:53.743457 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-w7db6_19b5471c-7b90-4839-91c4-31e5f98c3959/extract-utilities/0.log"
Feb 21 00:32:53 crc kubenswrapper[4906]: I0221 00:32:53.747815 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-w7db6_19b5471c-7b90-4839-91c4-31e5f98c3959/extract-content/0.log"
Feb 21 00:32:54 crc kubenswrapper[4906]: I0221 00:32:54.045882 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-w7db6_19b5471c-7b90-4839-91c4-31e5f98c3959/extract-utilities/0.log"
Feb 21 00:32:54 crc kubenswrapper[4906]: I0221 00:32:54.149287 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-w7db6_19b5471c-7b90-4839-91c4-31e5f98c3959/registry-server/0.log"
Feb 21 00:32:54 crc kubenswrapper[4906]: I0221 00:32:54.149779 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-w7db6_19b5471c-7b90-4839-91c4-31e5f98c3959/extract-content/0.log"
Feb 21 00:32:54 crc kubenswrapper[4906]: I0221 00:32:54.150488 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dffn4"
Feb 21 00:32:54 crc kubenswrapper[4906]: I0221 00:32:54.265098 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81f09ea7-f543-4bac-821d-e8284c2685c1-catalog-content\") pod \"81f09ea7-f543-4bac-821d-e8284c2685c1\" (UID: \"81f09ea7-f543-4bac-821d-e8284c2685c1\") "
Feb 21 00:32:54 crc kubenswrapper[4906]: I0221 00:32:54.265150 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5bqk\" (UniqueName: \"kubernetes.io/projected/81f09ea7-f543-4bac-821d-e8284c2685c1-kube-api-access-b5bqk\") pod \"81f09ea7-f543-4bac-821d-e8284c2685c1\" (UID: \"81f09ea7-f543-4bac-821d-e8284c2685c1\") "
Feb 21 00:32:54 crc kubenswrapper[4906]: I0221 00:32:54.265332 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81f09ea7-f543-4bac-821d-e8284c2685c1-utilities\") pod \"81f09ea7-f543-4bac-821d-e8284c2685c1\" (UID: \"81f09ea7-f543-4bac-821d-e8284c2685c1\") "
Feb 21 00:32:54 crc kubenswrapper[4906]: I0221 00:32:54.266074 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81f09ea7-f543-4bac-821d-e8284c2685c1-utilities" (OuterVolumeSpecName: "utilities") pod "81f09ea7-f543-4bac-821d-e8284c2685c1" (UID: "81f09ea7-f543-4bac-821d-e8284c2685c1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 21 00:32:54 crc kubenswrapper[4906]: I0221 00:32:54.277966 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81f09ea7-f543-4bac-821d-e8284c2685c1-kube-api-access-b5bqk" (OuterVolumeSpecName: "kube-api-access-b5bqk") pod "81f09ea7-f543-4bac-821d-e8284c2685c1" (UID: "81f09ea7-f543-4bac-821d-e8284c2685c1"). InnerVolumeSpecName "kube-api-access-b5bqk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 00:32:54 crc kubenswrapper[4906]: I0221 00:32:54.288893 4906 generic.go:334] "Generic (PLEG): container finished" podID="81f09ea7-f543-4bac-821d-e8284c2685c1" containerID="d5cae995f943def250cb2050125ef38847c57a20bcedfcc67a3031f043e66f40" exitCode=0
Feb 21 00:32:54 crc kubenswrapper[4906]: I0221 00:32:54.288955 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dffn4" event={"ID":"81f09ea7-f543-4bac-821d-e8284c2685c1","Type":"ContainerDied","Data":"d5cae995f943def250cb2050125ef38847c57a20bcedfcc67a3031f043e66f40"}
Feb 21 00:32:54 crc kubenswrapper[4906]: I0221 00:32:54.288988 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dffn4" event={"ID":"81f09ea7-f543-4bac-821d-e8284c2685c1","Type":"ContainerDied","Data":"5b50945694a6efdffcef697db36a8e0b37fd156741a979dd5a60226581d87ec6"}
Feb 21 00:32:54 crc kubenswrapper[4906]: I0221 00:32:54.289004 4906 scope.go:117] "RemoveContainer" containerID="d5cae995f943def250cb2050125ef38847c57a20bcedfcc67a3031f043e66f40"
Feb 21 00:32:54 crc kubenswrapper[4906]: I0221 00:32:54.289134 4906 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-operators-dffn4"
Feb 21 00:32:54 crc kubenswrapper[4906]: I0221 00:32:54.301061 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hxzlw_3554650e-6043-4de9-819e-b4f063f7414e/extract-utilities/0.log"
Feb 21 00:32:54 crc kubenswrapper[4906]: I0221 00:32:54.307409 4906 scope.go:117] "RemoveContainer" containerID="7b13c5df677ab755f708ec08318f58901f7bcabaf9a8b6e9954a2fb6dce5c122"
Feb 21 00:32:54 crc kubenswrapper[4906]: I0221 00:32:54.324726 4906 scope.go:117] "RemoveContainer" containerID="5d653188fc4bb56c2410d713a6b8439f448a1ce8c442f11e48634f897a8781dd"
Feb 21 00:32:54 crc kubenswrapper[4906]: I0221 00:32:54.358241 4906 scope.go:117] "RemoveContainer" containerID="d5cae995f943def250cb2050125ef38847c57a20bcedfcc67a3031f043e66f40"
Feb 21 00:32:54 crc kubenswrapper[4906]: E0221 00:32:54.358989 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5cae995f943def250cb2050125ef38847c57a20bcedfcc67a3031f043e66f40\": container with ID starting with d5cae995f943def250cb2050125ef38847c57a20bcedfcc67a3031f043e66f40 not found: ID does not exist" containerID="d5cae995f943def250cb2050125ef38847c57a20bcedfcc67a3031f043e66f40"
Feb 21 00:32:54 crc kubenswrapper[4906]: I0221 00:32:54.359041 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5cae995f943def250cb2050125ef38847c57a20bcedfcc67a3031f043e66f40"} err="failed to get container status \"d5cae995f943def250cb2050125ef38847c57a20bcedfcc67a3031f043e66f40\": rpc error: code = NotFound desc = could not find container \"d5cae995f943def250cb2050125ef38847c57a20bcedfcc67a3031f043e66f40\": container with ID starting with d5cae995f943def250cb2050125ef38847c57a20bcedfcc67a3031f043e66f40 not found: ID does not exist"
Feb 21 00:32:54 crc kubenswrapper[4906]: I0221 00:32:54.359083 4906 scope.go:117] "RemoveContainer" containerID="7b13c5df677ab755f708ec08318f58901f7bcabaf9a8b6e9954a2fb6dce5c122"
Feb 21 00:32:54 crc kubenswrapper[4906]: E0221 00:32:54.359462 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b13c5df677ab755f708ec08318f58901f7bcabaf9a8b6e9954a2fb6dce5c122\": container with ID starting with 7b13c5df677ab755f708ec08318f58901f7bcabaf9a8b6e9954a2fb6dce5c122 not found: ID does not exist" containerID="7b13c5df677ab755f708ec08318f58901f7bcabaf9a8b6e9954a2fb6dce5c122"
Feb 21 00:32:54 crc kubenswrapper[4906]: I0221 00:32:54.359509 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b13c5df677ab755f708ec08318f58901f7bcabaf9a8b6e9954a2fb6dce5c122"} err="failed to get container status \"7b13c5df677ab755f708ec08318f58901f7bcabaf9a8b6e9954a2fb6dce5c122\": rpc error: code = NotFound desc = could not find container \"7b13c5df677ab755f708ec08318f58901f7bcabaf9a8b6e9954a2fb6dce5c122\": container with ID starting with 7b13c5df677ab755f708ec08318f58901f7bcabaf9a8b6e9954a2fb6dce5c122 not found: ID does not exist"
Feb 21 00:32:54 crc kubenswrapper[4906]: I0221 00:32:54.359532 4906 scope.go:117] "RemoveContainer" containerID="5d653188fc4bb56c2410d713a6b8439f448a1ce8c442f11e48634f897a8781dd"
Feb 21 00:32:54 crc kubenswrapper[4906]: E0221 00:32:54.359838 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d653188fc4bb56c2410d713a6b8439f448a1ce8c442f11e48634f897a8781dd\": container with ID starting with 5d653188fc4bb56c2410d713a6b8439f448a1ce8c442f11e48634f897a8781dd not found: ID does not exist" containerID="5d653188fc4bb56c2410d713a6b8439f448a1ce8c442f11e48634f897a8781dd"
Feb 21 00:32:54 crc kubenswrapper[4906]: I0221 00:32:54.359883 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d653188fc4bb56c2410d713a6b8439f448a1ce8c442f11e48634f897a8781dd"} err="failed to get container status \"5d653188fc4bb56c2410d713a6b8439f448a1ce8c442f11e48634f897a8781dd\": rpc error: code = NotFound desc = could not find container \"5d653188fc4bb56c2410d713a6b8439f448a1ce8c442f11e48634f897a8781dd\": container with ID starting with 5d653188fc4bb56c2410d713a6b8439f448a1ce8c442f11e48634f897a8781dd not found: ID does not exist"
Feb 21 00:32:54 crc kubenswrapper[4906]: I0221 00:32:54.366950 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5bqk\" (UniqueName: \"kubernetes.io/projected/81f09ea7-f543-4bac-821d-e8284c2685c1-kube-api-access-b5bqk\") on node \"crc\" DevicePath \"\""
Feb 21 00:32:54 crc kubenswrapper[4906]: I0221 00:32:54.366982 4906 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81f09ea7-f543-4bac-821d-e8284c2685c1-utilities\") on node \"crc\" DevicePath \"\""
Feb 21 00:32:54 crc kubenswrapper[4906]: I0221 00:32:54.396193 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81f09ea7-f543-4bac-821d-e8284c2685c1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "81f09ea7-f543-4bac-821d-e8284c2685c1" (UID: "81f09ea7-f543-4bac-821d-e8284c2685c1"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 21 00:32:54 crc kubenswrapper[4906]: I0221 00:32:54.468746 4906 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81f09ea7-f543-4bac-821d-e8284c2685c1-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 21 00:32:54 crc kubenswrapper[4906]: I0221 00:32:54.495994 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hxzlw_3554650e-6043-4de9-819e-b4f063f7414e/extract-utilities/0.log"
Feb 21 00:32:54 crc kubenswrapper[4906]: I0221 00:32:54.513026 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hxzlw_3554650e-6043-4de9-819e-b4f063f7414e/extract-content/0.log"
Feb 21 00:32:54 crc kubenswrapper[4906]: I0221 00:32:54.524590 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hxzlw_3554650e-6043-4de9-819e-b4f063f7414e/extract-content/0.log"
Feb 21 00:32:54 crc kubenswrapper[4906]: I0221 00:32:54.616958 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dffn4"]
Feb 21 00:32:54 crc kubenswrapper[4906]: I0221 00:32:54.622510 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dffn4"]
Feb 21 00:32:54 crc kubenswrapper[4906]: I0221 00:32:54.696358 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hxzlw_3554650e-6043-4de9-819e-b4f063f7414e/extract-utilities/0.log"
Feb 21 00:32:54 crc kubenswrapper[4906]: I0221 00:32:54.731891 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hxzlw_3554650e-6043-4de9-819e-b4f063f7414e/extract-content/0.log"
Feb 21 00:32:54 crc kubenswrapper[4906]: I0221 00:32:54.881252 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hxzlw_3554650e-6043-4de9-819e-b4f063f7414e/registry-server/0.log"
Feb 21 00:32:54 crc kubenswrapper[4906]: I0221 00:32:54.991056 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-h4zrd_951dfb31-b021-468b-bbae-cd6077c0cfdd/marketplace-operator/2.log"
Feb 21 00:32:55 crc kubenswrapper[4906]: I0221 00:32:55.039032 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-h4zrd_951dfb31-b021-468b-bbae-cd6077c0cfdd/marketplace-operator/1.log"
Feb 21 00:32:55 crc kubenswrapper[4906]: I0221 00:32:55.520021 4906 scope.go:117] "RemoveContainer" containerID="6fc979229554d0d6557319058e84b2ee43f562aac81387cb471faf3829817c16"
Feb 21 00:32:55 crc kubenswrapper[4906]: E0221 00:32:55.520536 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9qdv_openshift-machine-config-operator(17518505-fa81-4399-b6cd-5527dae35ef3)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3"
Feb 21 00:32:55 crc kubenswrapper[4906]: I0221 00:32:55.525603 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81f09ea7-f543-4bac-821d-e8284c2685c1" path="/var/lib/kubelet/pods/81f09ea7-f543-4bac-821d-e8284c2685c1/volumes"
Feb 21 00:32:55 crc kubenswrapper[4906]: I0221 00:32:55.580923 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q75mj_af47c9fc-f15a-4557-bbbb-8426961d3fad/extract-utilities/0.log"
Feb 21 00:32:55 crc kubenswrapper[4906]: I0221 00:32:55.805421 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q75mj_af47c9fc-f15a-4557-bbbb-8426961d3fad/extract-utilities/0.log"
Feb 21 00:32:55 crc kubenswrapper[4906]: I0221 00:32:55.829676 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q75mj_af47c9fc-f15a-4557-bbbb-8426961d3fad/extract-content/0.log"
Feb 21 00:32:55 crc kubenswrapper[4906]: I0221 00:32:55.831652 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q75mj_af47c9fc-f15a-4557-bbbb-8426961d3fad/extract-content/0.log"
Feb 21 00:32:56 crc kubenswrapper[4906]: I0221 00:32:56.041559 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q75mj_af47c9fc-f15a-4557-bbbb-8426961d3fad/extract-content/0.log"
Feb 21 00:32:56 crc kubenswrapper[4906]: I0221 00:32:56.042050 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q75mj_af47c9fc-f15a-4557-bbbb-8426961d3fad/extract-utilities/0.log"
Feb 21 00:32:56 crc kubenswrapper[4906]: I0221 00:32:56.250570 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q75mj_af47c9fc-f15a-4557-bbbb-8426961d3fad/registry-server/0.log"
Feb 21 00:32:58 crc kubenswrapper[4906]: E0221 00:32:58.518942 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a"
Feb 21 00:33:02 crc kubenswrapper[4906]: E0221 00:33:02.518625 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image
\\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f"
Feb 21 00:33:07 crc kubenswrapper[4906]: I0221 00:33:07.516946 4906 scope.go:117] "RemoveContainer" containerID="6fc979229554d0d6557319058e84b2ee43f562aac81387cb471faf3829817c16"
Feb 21 00:33:07 crc kubenswrapper[4906]: E0221 00:33:07.517651 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9qdv_openshift-machine-config-operator(17518505-fa81-4399-b6cd-5527dae35ef3)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3"
Feb 21 00:33:09 crc kubenswrapper[4906]: I0221 00:33:09.287738 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-86d9894984-bcvsj_93269518-8ff5-4e82-8f60-1a3382b87720/prometheus-operator-admission-webhook/0.log"
Feb 21 00:33:09 crc kubenswrapper[4906]: I0221 00:33:09.305886 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-86d9894984-2k8rf_b36dc052-4396-45b6-9169-1efee41b5e32/prometheus-operator-admission-webhook/0.log"
Feb 21 00:33:09 crc kubenswrapper[4906]: I0221 00:33:09.317702 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-7lwz7_8600dc47-ea8b-4033-b9cc-fbb62f54e36e/prometheus-operator/0.log"
Feb 21 00:33:09 crc kubenswrapper[4906]: I0221 00:33:09.466913 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-khgkm_bc85b339-0a0f-4745-88fa-8b48af27f6be/operator/0.log"
Feb 21 00:33:09 crc kubenswrapper[4906]: I0221 00:33:09.486973 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-xbkjz_34cdaea4-84d8-4753-be71-0b73d7e9d1ba/perses-operator/0.log"
Feb 21 00:33:12 crc kubenswrapper[4906]: E0221 00:33:12.517787 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a"
Feb 21 00:33:17 crc kubenswrapper[4906]: E0221 00:33:17.519847 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f"
Feb 21 00:33:22 crc kubenswrapper[4906]: I0221 00:33:22.516936 4906 scope.go:117] "RemoveContainer" containerID="6fc979229554d0d6557319058e84b2ee43f562aac81387cb471faf3829817c16"
Feb 21 00:33:22 crc kubenswrapper[4906]: E0221 00:33:22.517399 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9qdv_openshift-machine-config-operator(17518505-fa81-4399-b6cd-5527dae35ef3)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3"
Feb 21 00:33:25 crc kubenswrapper[4906]: E0221 00:33:25.526438 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a"
Feb 21 00:33:31 crc kubenswrapper[4906]: E0221 00:33:31.521602 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f"
Feb 21 00:33:37 crc kubenswrapper[4906]: I0221 00:33:37.517727 4906 scope.go:117] "RemoveContainer" containerID="6fc979229554d0d6557319058e84b2ee43f562aac81387cb471faf3829817c16"
Feb 21 00:33:37 crc kubenswrapper[4906]: E0221 00:33:37.518532 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9qdv_openshift-machine-config-operator(17518505-fa81-4399-b6cd-5527dae35ef3)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3"
Feb 21 00:33:39 crc kubenswrapper[4906]: E0221 00:33:39.523721 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a"
Feb 21 00:33:42 crc kubenswrapper[4906]: E0221 00:33:42.519655 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image
\\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:33:48 crc kubenswrapper[4906]: I0221 00:33:48.517294 4906 scope.go:117] "RemoveContainer" containerID="6fc979229554d0d6557319058e84b2ee43f562aac81387cb471faf3829817c16" Feb 21 00:33:48 crc kubenswrapper[4906]: E0221 00:33:48.518046 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9qdv_openshift-machine-config-operator(17518505-fa81-4399-b6cd-5527dae35ef3)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3" Feb 21 00:33:52 crc kubenswrapper[4906]: E0221 00:33:52.519433 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:33:56 crc kubenswrapper[4906]: E0221 00:33:56.520024 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:34:01 crc kubenswrapper[4906]: I0221 00:34:01.517970 4906 scope.go:117] "RemoveContainer" containerID="6fc979229554d0d6557319058e84b2ee43f562aac81387cb471faf3829817c16" Feb 21 00:34:01 crc kubenswrapper[4906]: E0221 
00:34:01.518514 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9qdv_openshift-machine-config-operator(17518505-fa81-4399-b6cd-5527dae35ef3)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3" Feb 21 00:34:03 crc kubenswrapper[4906]: I0221 00:34:03.786194 4906 generic.go:334] "Generic (PLEG): container finished" podID="695d6cf5-e3f6-44c4-b2e9-d84ea9e7c7f4" containerID="0fa7a53994398e9e2df0c724d17fa077b09bc6290a95e4fd359328aab27e2c05" exitCode=0 Feb 21 00:34:03 crc kubenswrapper[4906]: I0221 00:34:03.786258 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vt6l9/must-gather-z6gz2" event={"ID":"695d6cf5-e3f6-44c4-b2e9-d84ea9e7c7f4","Type":"ContainerDied","Data":"0fa7a53994398e9e2df0c724d17fa077b09bc6290a95e4fd359328aab27e2c05"} Feb 21 00:34:03 crc kubenswrapper[4906]: I0221 00:34:03.786903 4906 scope.go:117] "RemoveContainer" containerID="0fa7a53994398e9e2df0c724d17fa077b09bc6290a95e4fd359328aab27e2c05" Feb 21 00:34:04 crc kubenswrapper[4906]: I0221 00:34:04.166865 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vt6l9_must-gather-z6gz2_695d6cf5-e3f6-44c4-b2e9-d84ea9e7c7f4/gather/0.log" Feb 21 00:34:05 crc kubenswrapper[4906]: E0221 00:34:05.526989 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:34:07 crc kubenswrapper[4906]: E0221 00:34:07.519908 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:34:10 crc kubenswrapper[4906]: I0221 00:34:10.876578 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vt6l9/must-gather-z6gz2"] Feb 21 00:34:10 crc kubenswrapper[4906]: I0221 00:34:10.877064 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-vt6l9/must-gather-z6gz2" podUID="695d6cf5-e3f6-44c4-b2e9-d84ea9e7c7f4" containerName="copy" containerID="cri-o://8ae66f587489ec3537177f64fdb5e5fead073d40821e79f142f0b8e371669c78" gracePeriod=2 Feb 21 00:34:10 crc kubenswrapper[4906]: I0221 00:34:10.921674 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vt6l9/must-gather-z6gz2"] Feb 21 00:34:11 crc kubenswrapper[4906]: I0221 00:34:11.205240 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vt6l9_must-gather-z6gz2_695d6cf5-e3f6-44c4-b2e9-d84ea9e7c7f4/copy/0.log" Feb 21 00:34:11 crc kubenswrapper[4906]: I0221 00:34:11.205976 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vt6l9/must-gather-z6gz2" Feb 21 00:34:11 crc kubenswrapper[4906]: I0221 00:34:11.321120 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/695d6cf5-e3f6-44c4-b2e9-d84ea9e7c7f4-must-gather-output\") pod \"695d6cf5-e3f6-44c4-b2e9-d84ea9e7c7f4\" (UID: \"695d6cf5-e3f6-44c4-b2e9-d84ea9e7c7f4\") " Feb 21 00:34:11 crc kubenswrapper[4906]: I0221 00:34:11.321218 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k52f4\" (UniqueName: \"kubernetes.io/projected/695d6cf5-e3f6-44c4-b2e9-d84ea9e7c7f4-kube-api-access-k52f4\") pod \"695d6cf5-e3f6-44c4-b2e9-d84ea9e7c7f4\" (UID: \"695d6cf5-e3f6-44c4-b2e9-d84ea9e7c7f4\") " Feb 21 00:34:11 crc kubenswrapper[4906]: I0221 00:34:11.332990 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/695d6cf5-e3f6-44c4-b2e9-d84ea9e7c7f4-kube-api-access-k52f4" (OuterVolumeSpecName: "kube-api-access-k52f4") pod "695d6cf5-e3f6-44c4-b2e9-d84ea9e7c7f4" (UID: "695d6cf5-e3f6-44c4-b2e9-d84ea9e7c7f4"). InnerVolumeSpecName "kube-api-access-k52f4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:34:11 crc kubenswrapper[4906]: I0221 00:34:11.382419 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/695d6cf5-e3f6-44c4-b2e9-d84ea9e7c7f4-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "695d6cf5-e3f6-44c4-b2e9-d84ea9e7c7f4" (UID: "695d6cf5-e3f6-44c4-b2e9-d84ea9e7c7f4"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:34:11 crc kubenswrapper[4906]: I0221 00:34:11.422723 4906 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/695d6cf5-e3f6-44c4-b2e9-d84ea9e7c7f4-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 21 00:34:11 crc kubenswrapper[4906]: I0221 00:34:11.422758 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k52f4\" (UniqueName: \"kubernetes.io/projected/695d6cf5-e3f6-44c4-b2e9-d84ea9e7c7f4-kube-api-access-k52f4\") on node \"crc\" DevicePath \"\"" Feb 21 00:34:11 crc kubenswrapper[4906]: I0221 00:34:11.524563 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="695d6cf5-e3f6-44c4-b2e9-d84ea9e7c7f4" path="/var/lib/kubelet/pods/695d6cf5-e3f6-44c4-b2e9-d84ea9e7c7f4/volumes" Feb 21 00:34:11 crc kubenswrapper[4906]: I0221 00:34:11.836007 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vt6l9_must-gather-z6gz2_695d6cf5-e3f6-44c4-b2e9-d84ea9e7c7f4/copy/0.log" Feb 21 00:34:11 crc kubenswrapper[4906]: I0221 00:34:11.836596 4906 generic.go:334] "Generic (PLEG): container finished" podID="695d6cf5-e3f6-44c4-b2e9-d84ea9e7c7f4" containerID="8ae66f587489ec3537177f64fdb5e5fead073d40821e79f142f0b8e371669c78" exitCode=143 Feb 21 00:34:11 crc kubenswrapper[4906]: I0221 00:34:11.836661 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vt6l9/must-gather-z6gz2" Feb 21 00:34:11 crc kubenswrapper[4906]: I0221 00:34:11.836652 4906 scope.go:117] "RemoveContainer" containerID="8ae66f587489ec3537177f64fdb5e5fead073d40821e79f142f0b8e371669c78" Feb 21 00:34:11 crc kubenswrapper[4906]: I0221 00:34:11.854639 4906 scope.go:117] "RemoveContainer" containerID="0fa7a53994398e9e2df0c724d17fa077b09bc6290a95e4fd359328aab27e2c05" Feb 21 00:34:11 crc kubenswrapper[4906]: I0221 00:34:11.885489 4906 scope.go:117] "RemoveContainer" containerID="8ae66f587489ec3537177f64fdb5e5fead073d40821e79f142f0b8e371669c78" Feb 21 00:34:11 crc kubenswrapper[4906]: E0221 00:34:11.885901 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ae66f587489ec3537177f64fdb5e5fead073d40821e79f142f0b8e371669c78\": container with ID starting with 8ae66f587489ec3537177f64fdb5e5fead073d40821e79f142f0b8e371669c78 not found: ID does not exist" containerID="8ae66f587489ec3537177f64fdb5e5fead073d40821e79f142f0b8e371669c78" Feb 21 00:34:11 crc kubenswrapper[4906]: I0221 00:34:11.885937 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ae66f587489ec3537177f64fdb5e5fead073d40821e79f142f0b8e371669c78"} err="failed to get container status \"8ae66f587489ec3537177f64fdb5e5fead073d40821e79f142f0b8e371669c78\": rpc error: code = NotFound desc = could not find container \"8ae66f587489ec3537177f64fdb5e5fead073d40821e79f142f0b8e371669c78\": container with ID starting with 8ae66f587489ec3537177f64fdb5e5fead073d40821e79f142f0b8e371669c78 not found: ID does not exist" Feb 21 00:34:11 crc kubenswrapper[4906]: I0221 00:34:11.885965 4906 scope.go:117] "RemoveContainer" containerID="0fa7a53994398e9e2df0c724d17fa077b09bc6290a95e4fd359328aab27e2c05" Feb 21 00:34:11 crc kubenswrapper[4906]: E0221 00:34:11.886472 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"0fa7a53994398e9e2df0c724d17fa077b09bc6290a95e4fd359328aab27e2c05\": container with ID starting with 0fa7a53994398e9e2df0c724d17fa077b09bc6290a95e4fd359328aab27e2c05 not found: ID does not exist" containerID="0fa7a53994398e9e2df0c724d17fa077b09bc6290a95e4fd359328aab27e2c05" Feb 21 00:34:11 crc kubenswrapper[4906]: I0221 00:34:11.886593 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fa7a53994398e9e2df0c724d17fa077b09bc6290a95e4fd359328aab27e2c05"} err="failed to get container status \"0fa7a53994398e9e2df0c724d17fa077b09bc6290a95e4fd359328aab27e2c05\": rpc error: code = NotFound desc = could not find container \"0fa7a53994398e9e2df0c724d17fa077b09bc6290a95e4fd359328aab27e2c05\": container with ID starting with 0fa7a53994398e9e2df0c724d17fa077b09bc6290a95e4fd359328aab27e2c05 not found: ID does not exist" Feb 21 00:34:12 crc kubenswrapper[4906]: I0221 00:34:12.517068 4906 scope.go:117] "RemoveContainer" containerID="6fc979229554d0d6557319058e84b2ee43f562aac81387cb471faf3829817c16" Feb 21 00:34:12 crc kubenswrapper[4906]: E0221 00:34:12.517627 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9qdv_openshift-machine-config-operator(17518505-fa81-4399-b6cd-5527dae35ef3)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3" Feb 21 00:34:17 crc kubenswrapper[4906]: E0221 00:34:17.520232 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" 
podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:34:22 crc kubenswrapper[4906]: E0221 00:34:22.519395 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:34:26 crc kubenswrapper[4906]: I0221 00:34:26.517898 4906 scope.go:117] "RemoveContainer" containerID="6fc979229554d0d6557319058e84b2ee43f562aac81387cb471faf3829817c16" Feb 21 00:34:26 crc kubenswrapper[4906]: E0221 00:34:26.518784 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9qdv_openshift-machine-config-operator(17518505-fa81-4399-b6cd-5527dae35ef3)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3" Feb 21 00:34:32 crc kubenswrapper[4906]: E0221 00:34:32.518579 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:34:33 crc kubenswrapper[4906]: E0221 00:34:33.520004 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-7gxrh" 
podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:34:37 crc kubenswrapper[4906]: I0221 00:34:37.517797 4906 scope.go:117] "RemoveContainer" containerID="6fc979229554d0d6557319058e84b2ee43f562aac81387cb471faf3829817c16" Feb 21 00:34:37 crc kubenswrapper[4906]: E0221 00:34:37.518742 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9qdv_openshift-machine-config-operator(17518505-fa81-4399-b6cd-5527dae35ef3)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3" Feb 21 00:34:46 crc kubenswrapper[4906]: E0221 00:34:46.521546 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:34:47 crc kubenswrapper[4906]: E0221 00:34:47.519102 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:34:51 crc kubenswrapper[4906]: I0221 00:34:51.517442 4906 scope.go:117] "RemoveContainer" containerID="6fc979229554d0d6557319058e84b2ee43f562aac81387cb471faf3829817c16" Feb 21 00:34:51 crc kubenswrapper[4906]: E0221 00:34:51.518249 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9qdv_openshift-machine-config-operator(17518505-fa81-4399-b6cd-5527dae35ef3)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3" Feb 21 00:34:57 crc kubenswrapper[4906]: E0221 00:34:57.519564 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:35:00 crc kubenswrapper[4906]: E0221 00:35:00.519577 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:35:05 crc kubenswrapper[4906]: I0221 00:35:05.520896 4906 scope.go:117] "RemoveContainer" containerID="6fc979229554d0d6557319058e84b2ee43f562aac81387cb471faf3829817c16" Feb 21 00:35:05 crc kubenswrapper[4906]: E0221 00:35:05.521567 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9qdv_openshift-machine-config-operator(17518505-fa81-4399-b6cd-5527dae35ef3)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3" Feb 21 00:35:09 crc kubenswrapper[4906]: E0221 00:35:09.521024 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:35:13 crc kubenswrapper[4906]: E0221 00:35:13.519825 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:35:19 crc kubenswrapper[4906]: I0221 00:35:19.517605 4906 scope.go:117] "RemoveContainer" containerID="6fc979229554d0d6557319058e84b2ee43f562aac81387cb471faf3829817c16" Feb 21 00:35:19 crc kubenswrapper[4906]: E0221 00:35:19.518667 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9qdv_openshift-machine-config-operator(17518505-fa81-4399-b6cd-5527dae35ef3)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3" Feb 21 00:35:21 crc kubenswrapper[4906]: E0221 00:35:21.522572 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:35:24 crc kubenswrapper[4906]: E0221 00:35:24.518848 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:35:33 crc kubenswrapper[4906]: I0221 00:35:33.517595 4906 scope.go:117] "RemoveContainer" containerID="6fc979229554d0d6557319058e84b2ee43f562aac81387cb471faf3829817c16" Feb 21 00:35:33 crc kubenswrapper[4906]: E0221 00:35:33.518500 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9qdv_openshift-machine-config-operator(17518505-fa81-4399-b6cd-5527dae35ef3)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3" Feb 21 00:35:34 crc kubenswrapper[4906]: E0221 00:35:34.520352 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:35:38 crc kubenswrapper[4906]: E0221 00:35:38.519211 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:35:45 crc kubenswrapper[4906]: I0221 00:35:45.520072 4906 scope.go:117] "RemoveContainer" 
containerID="6fc979229554d0d6557319058e84b2ee43f562aac81387cb471faf3829817c16" Feb 21 00:35:45 crc kubenswrapper[4906]: E0221 00:35:45.522675 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9qdv_openshift-machine-config-operator(17518505-fa81-4399-b6cd-5527dae35ef3)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3" Feb 21 00:35:45 crc kubenswrapper[4906]: E0221 00:35:45.525249 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:35:49 crc kubenswrapper[4906]: E0221 00:35:49.519227 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:35:57 crc kubenswrapper[4906]: E0221 00:35:57.535830 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:35:58 crc kubenswrapper[4906]: I0221 00:35:58.517217 4906 scope.go:117] "RemoveContainer" 
containerID="6fc979229554d0d6557319058e84b2ee43f562aac81387cb471faf3829817c16" Feb 21 00:35:58 crc kubenswrapper[4906]: E0221 00:35:58.517606 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9qdv_openshift-machine-config-operator(17518505-fa81-4399-b6cd-5527dae35ef3)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3" Feb 21 00:36:03 crc kubenswrapper[4906]: I0221 00:36:03.519721 4906 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 21 00:36:03 crc kubenswrapper[4906]: E0221 00:36:03.567410 4906 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest: reading manifest latest in image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index: manifest unknown" image="image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest" Feb 21 00:36:03 crc kubenswrapper[4906]: E0221 00:36:03.567618 4906 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:registry-server,Image:image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:grpc,HostPort:0,ContainerPort:50051,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8s9w5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infrawatch-operators-l49m7_service-telemetry(6acb63f4-309c-49bc-b8e6-007db92e699a): ErrImagePull: initializing source docker://image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest: reading manifest latest in 
image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index: manifest unknown" logger="UnhandledError" Feb 21 00:36:03 crc kubenswrapper[4906]: E0221 00:36:03.568857 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ErrImagePull: \"initializing source docker://image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest: reading manifest latest in image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index: manifest unknown\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:36:05 crc kubenswrapper[4906]: I0221 00:36:05.137323 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4q2t9"] Feb 21 00:36:05 crc kubenswrapper[4906]: E0221 00:36:05.137826 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="695d6cf5-e3f6-44c4-b2e9-d84ea9e7c7f4" containerName="gather" Feb 21 00:36:05 crc kubenswrapper[4906]: I0221 00:36:05.137860 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="695d6cf5-e3f6-44c4-b2e9-d84ea9e7c7f4" containerName="gather" Feb 21 00:36:05 crc kubenswrapper[4906]: E0221 00:36:05.137885 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81f09ea7-f543-4bac-821d-e8284c2685c1" containerName="extract-utilities" Feb 21 00:36:05 crc kubenswrapper[4906]: I0221 00:36:05.137902 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="81f09ea7-f543-4bac-821d-e8284c2685c1" containerName="extract-utilities" Feb 21 00:36:05 crc kubenswrapper[4906]: E0221 00:36:05.137929 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81f09ea7-f543-4bac-821d-e8284c2685c1" containerName="registry-server" Feb 21 00:36:05 crc kubenswrapper[4906]: I0221 00:36:05.137949 4906 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="81f09ea7-f543-4bac-821d-e8284c2685c1" containerName="registry-server" Feb 21 00:36:05 crc kubenswrapper[4906]: E0221 00:36:05.137975 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="695d6cf5-e3f6-44c4-b2e9-d84ea9e7c7f4" containerName="copy" Feb 21 00:36:05 crc kubenswrapper[4906]: I0221 00:36:05.137991 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="695d6cf5-e3f6-44c4-b2e9-d84ea9e7c7f4" containerName="copy" Feb 21 00:36:05 crc kubenswrapper[4906]: E0221 00:36:05.138005 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81f09ea7-f543-4bac-821d-e8284c2685c1" containerName="extract-content" Feb 21 00:36:05 crc kubenswrapper[4906]: I0221 00:36:05.138017 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="81f09ea7-f543-4bac-821d-e8284c2685c1" containerName="extract-content" Feb 21 00:36:05 crc kubenswrapper[4906]: I0221 00:36:05.138203 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="81f09ea7-f543-4bac-821d-e8284c2685c1" containerName="registry-server" Feb 21 00:36:05 crc kubenswrapper[4906]: I0221 00:36:05.138224 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="695d6cf5-e3f6-44c4-b2e9-d84ea9e7c7f4" containerName="gather" Feb 21 00:36:05 crc kubenswrapper[4906]: I0221 00:36:05.138241 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="695d6cf5-e3f6-44c4-b2e9-d84ea9e7c7f4" containerName="copy" Feb 21 00:36:05 crc kubenswrapper[4906]: I0221 00:36:05.140025 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4q2t9" Feb 21 00:36:05 crc kubenswrapper[4906]: I0221 00:36:05.156422 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4q2t9"] Feb 21 00:36:05 crc kubenswrapper[4906]: I0221 00:36:05.314297 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fk87z\" (UniqueName: \"kubernetes.io/projected/248fbd3c-e9c9-4224-b7c0-d6c51e6c754e-kube-api-access-fk87z\") pod \"certified-operators-4q2t9\" (UID: \"248fbd3c-e9c9-4224-b7c0-d6c51e6c754e\") " pod="openshift-marketplace/certified-operators-4q2t9" Feb 21 00:36:05 crc kubenswrapper[4906]: I0221 00:36:05.314350 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/248fbd3c-e9c9-4224-b7c0-d6c51e6c754e-catalog-content\") pod \"certified-operators-4q2t9\" (UID: \"248fbd3c-e9c9-4224-b7c0-d6c51e6c754e\") " pod="openshift-marketplace/certified-operators-4q2t9" Feb 21 00:36:05 crc kubenswrapper[4906]: I0221 00:36:05.314391 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/248fbd3c-e9c9-4224-b7c0-d6c51e6c754e-utilities\") pod \"certified-operators-4q2t9\" (UID: \"248fbd3c-e9c9-4224-b7c0-d6c51e6c754e\") " pod="openshift-marketplace/certified-operators-4q2t9" Feb 21 00:36:05 crc kubenswrapper[4906]: I0221 00:36:05.416241 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fk87z\" (UniqueName: \"kubernetes.io/projected/248fbd3c-e9c9-4224-b7c0-d6c51e6c754e-kube-api-access-fk87z\") pod \"certified-operators-4q2t9\" (UID: \"248fbd3c-e9c9-4224-b7c0-d6c51e6c754e\") " pod="openshift-marketplace/certified-operators-4q2t9" Feb 21 00:36:05 crc kubenswrapper[4906]: I0221 00:36:05.416281 4906 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/248fbd3c-e9c9-4224-b7c0-d6c51e6c754e-catalog-content\") pod \"certified-operators-4q2t9\" (UID: \"248fbd3c-e9c9-4224-b7c0-d6c51e6c754e\") " pod="openshift-marketplace/certified-operators-4q2t9" Feb 21 00:36:05 crc kubenswrapper[4906]: I0221 00:36:05.416312 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/248fbd3c-e9c9-4224-b7c0-d6c51e6c754e-utilities\") pod \"certified-operators-4q2t9\" (UID: \"248fbd3c-e9c9-4224-b7c0-d6c51e6c754e\") " pod="openshift-marketplace/certified-operators-4q2t9" Feb 21 00:36:05 crc kubenswrapper[4906]: I0221 00:36:05.417111 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/248fbd3c-e9c9-4224-b7c0-d6c51e6c754e-utilities\") pod \"certified-operators-4q2t9\" (UID: \"248fbd3c-e9c9-4224-b7c0-d6c51e6c754e\") " pod="openshift-marketplace/certified-operators-4q2t9" Feb 21 00:36:05 crc kubenswrapper[4906]: I0221 00:36:05.417218 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/248fbd3c-e9c9-4224-b7c0-d6c51e6c754e-catalog-content\") pod \"certified-operators-4q2t9\" (UID: \"248fbd3c-e9c9-4224-b7c0-d6c51e6c754e\") " pod="openshift-marketplace/certified-operators-4q2t9" Feb 21 00:36:05 crc kubenswrapper[4906]: I0221 00:36:05.448568 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fk87z\" (UniqueName: \"kubernetes.io/projected/248fbd3c-e9c9-4224-b7c0-d6c51e6c754e-kube-api-access-fk87z\") pod \"certified-operators-4q2t9\" (UID: \"248fbd3c-e9c9-4224-b7c0-d6c51e6c754e\") " pod="openshift-marketplace/certified-operators-4q2t9" Feb 21 00:36:05 crc kubenswrapper[4906]: I0221 00:36:05.474479 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4q2t9" Feb 21 00:36:05 crc kubenswrapper[4906]: I0221 00:36:05.692222 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4q2t9"] Feb 21 00:36:05 crc kubenswrapper[4906]: I0221 00:36:05.867871 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4q2t9" event={"ID":"248fbd3c-e9c9-4224-b7c0-d6c51e6c754e","Type":"ContainerStarted","Data":"d6b08abce9b2ff1f5290dd85173cb074d144748ccc89d98ef0f7645faffd2f58"} Feb 21 00:36:05 crc kubenswrapper[4906]: I0221 00:36:05.867927 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4q2t9" event={"ID":"248fbd3c-e9c9-4224-b7c0-d6c51e6c754e","Type":"ContainerStarted","Data":"224d4bd2422fb8c39a3a7ba3ab394e37961009ebe2f69f648ea11ca56f1a7e2f"} Feb 21 00:36:06 crc kubenswrapper[4906]: I0221 00:36:06.878432 4906 generic.go:334] "Generic (PLEG): container finished" podID="248fbd3c-e9c9-4224-b7c0-d6c51e6c754e" containerID="d6b08abce9b2ff1f5290dd85173cb074d144748ccc89d98ef0f7645faffd2f58" exitCode=0 Feb 21 00:36:06 crc kubenswrapper[4906]: I0221 00:36:06.878549 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4q2t9" event={"ID":"248fbd3c-e9c9-4224-b7c0-d6c51e6c754e","Type":"ContainerDied","Data":"d6b08abce9b2ff1f5290dd85173cb074d144748ccc89d98ef0f7645faffd2f58"} Feb 21 00:36:06 crc kubenswrapper[4906]: I0221 00:36:06.878841 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4q2t9" event={"ID":"248fbd3c-e9c9-4224-b7c0-d6c51e6c754e","Type":"ContainerStarted","Data":"5c58946a27d36a593f9382b751fd49196729cf91acfb1a6f3f47d7e9ab495e35"} Feb 21 00:36:07 crc kubenswrapper[4906]: I0221 00:36:07.887587 4906 generic.go:334] "Generic (PLEG): container finished" podID="248fbd3c-e9c9-4224-b7c0-d6c51e6c754e" 
containerID="5c58946a27d36a593f9382b751fd49196729cf91acfb1a6f3f47d7e9ab495e35" exitCode=0 Feb 21 00:36:07 crc kubenswrapper[4906]: I0221 00:36:07.887644 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4q2t9" event={"ID":"248fbd3c-e9c9-4224-b7c0-d6c51e6c754e","Type":"ContainerDied","Data":"5c58946a27d36a593f9382b751fd49196729cf91acfb1a6f3f47d7e9ab495e35"} Feb 21 00:36:08 crc kubenswrapper[4906]: I0221 00:36:08.898114 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4q2t9" event={"ID":"248fbd3c-e9c9-4224-b7c0-d6c51e6c754e","Type":"ContainerStarted","Data":"3ebb3de7bd4a1857bdcf2924ac2242edc77c26bddb04bebd379a256457fb63cf"} Feb 21 00:36:08 crc kubenswrapper[4906]: I0221 00:36:08.922603 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4q2t9" podStartSLOduration=1.225317586 podStartE2EDuration="3.922580244s" podCreationTimestamp="2026-02-21 00:36:05 +0000 UTC" firstStartedPulling="2026-02-21 00:36:05.869278613 +0000 UTC m=+1701.120866119" lastFinishedPulling="2026-02-21 00:36:08.566541251 +0000 UTC m=+1703.818128777" observedRunningTime="2026-02-21 00:36:08.919483536 +0000 UTC m=+1704.171071042" watchObservedRunningTime="2026-02-21 00:36:08.922580244 +0000 UTC m=+1704.174167760" Feb 21 00:36:09 crc kubenswrapper[4906]: E0221 00:36:09.519349 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:36:10 crc kubenswrapper[4906]: I0221 00:36:10.517562 4906 scope.go:117] "RemoveContainer" containerID="6fc979229554d0d6557319058e84b2ee43f562aac81387cb471faf3829817c16" Feb 21 
00:36:10 crc kubenswrapper[4906]: E0221 00:36:10.517835 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9qdv_openshift-machine-config-operator(17518505-fa81-4399-b6cd-5527dae35ef3)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3" Feb 21 00:36:15 crc kubenswrapper[4906]: I0221 00:36:15.476145 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4q2t9" Feb 21 00:36:15 crc kubenswrapper[4906]: I0221 00:36:15.476522 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4q2t9" Feb 21 00:36:15 crc kubenswrapper[4906]: I0221 00:36:15.547457 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4q2t9" Feb 21 00:36:16 crc kubenswrapper[4906]: I0221 00:36:16.025254 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4q2t9" Feb 21 00:36:16 crc kubenswrapper[4906]: I0221 00:36:16.092382 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4q2t9"] Feb 21 00:36:17 crc kubenswrapper[4906]: I0221 00:36:17.974305 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4q2t9" podUID="248fbd3c-e9c9-4224-b7c0-d6c51e6c754e" containerName="registry-server" containerID="cri-o://3ebb3de7bd4a1857bdcf2924ac2242edc77c26bddb04bebd379a256457fb63cf" gracePeriod=2 Feb 21 00:36:18 crc kubenswrapper[4906]: I0221 00:36:18.396692 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4q2t9" Feb 21 00:36:18 crc kubenswrapper[4906]: I0221 00:36:18.516275 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/248fbd3c-e9c9-4224-b7c0-d6c51e6c754e-catalog-content\") pod \"248fbd3c-e9c9-4224-b7c0-d6c51e6c754e\" (UID: \"248fbd3c-e9c9-4224-b7c0-d6c51e6c754e\") " Feb 21 00:36:18 crc kubenswrapper[4906]: I0221 00:36:18.516488 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/248fbd3c-e9c9-4224-b7c0-d6c51e6c754e-utilities\") pod \"248fbd3c-e9c9-4224-b7c0-d6c51e6c754e\" (UID: \"248fbd3c-e9c9-4224-b7c0-d6c51e6c754e\") " Feb 21 00:36:18 crc kubenswrapper[4906]: I0221 00:36:18.516558 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fk87z\" (UniqueName: \"kubernetes.io/projected/248fbd3c-e9c9-4224-b7c0-d6c51e6c754e-kube-api-access-fk87z\") pod \"248fbd3c-e9c9-4224-b7c0-d6c51e6c754e\" (UID: \"248fbd3c-e9c9-4224-b7c0-d6c51e6c754e\") " Feb 21 00:36:18 crc kubenswrapper[4906]: I0221 00:36:18.518446 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/248fbd3c-e9c9-4224-b7c0-d6c51e6c754e-utilities" (OuterVolumeSpecName: "utilities") pod "248fbd3c-e9c9-4224-b7c0-d6c51e6c754e" (UID: "248fbd3c-e9c9-4224-b7c0-d6c51e6c754e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:36:18 crc kubenswrapper[4906]: E0221 00:36:18.520226 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:36:18 crc kubenswrapper[4906]: I0221 00:36:18.525513 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/248fbd3c-e9c9-4224-b7c0-d6c51e6c754e-kube-api-access-fk87z" (OuterVolumeSpecName: "kube-api-access-fk87z") pod "248fbd3c-e9c9-4224-b7c0-d6c51e6c754e" (UID: "248fbd3c-e9c9-4224-b7c0-d6c51e6c754e"). InnerVolumeSpecName "kube-api-access-fk87z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 00:36:18 crc kubenswrapper[4906]: I0221 00:36:18.615864 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/248fbd3c-e9c9-4224-b7c0-d6c51e6c754e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "248fbd3c-e9c9-4224-b7c0-d6c51e6c754e" (UID: "248fbd3c-e9c9-4224-b7c0-d6c51e6c754e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 00:36:18 crc kubenswrapper[4906]: I0221 00:36:18.618198 4906 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/248fbd3c-e9c9-4224-b7c0-d6c51e6c754e-utilities\") on node \"crc\" DevicePath \"\"" Feb 21 00:36:18 crc kubenswrapper[4906]: I0221 00:36:18.618246 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fk87z\" (UniqueName: \"kubernetes.io/projected/248fbd3c-e9c9-4224-b7c0-d6c51e6c754e-kube-api-access-fk87z\") on node \"crc\" DevicePath \"\"" Feb 21 00:36:18 crc kubenswrapper[4906]: I0221 00:36:18.618269 4906 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/248fbd3c-e9c9-4224-b7c0-d6c51e6c754e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 21 00:36:18 crc kubenswrapper[4906]: I0221 00:36:18.986873 4906 generic.go:334] "Generic (PLEG): container finished" podID="248fbd3c-e9c9-4224-b7c0-d6c51e6c754e" containerID="3ebb3de7bd4a1857bdcf2924ac2242edc77c26bddb04bebd379a256457fb63cf" exitCode=0 Feb 21 00:36:18 crc kubenswrapper[4906]: I0221 00:36:18.987240 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4q2t9" event={"ID":"248fbd3c-e9c9-4224-b7c0-d6c51e6c754e","Type":"ContainerDied","Data":"3ebb3de7bd4a1857bdcf2924ac2242edc77c26bddb04bebd379a256457fb63cf"} Feb 21 00:36:18 crc kubenswrapper[4906]: I0221 00:36:18.987282 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4q2t9" event={"ID":"248fbd3c-e9c9-4224-b7c0-d6c51e6c754e","Type":"ContainerDied","Data":"224d4bd2422fb8c39a3a7ba3ab394e37961009ebe2f69f648ea11ca56f1a7e2f"} Feb 21 00:36:18 crc kubenswrapper[4906]: I0221 00:36:18.987311 4906 scope.go:117] "RemoveContainer" containerID="3ebb3de7bd4a1857bdcf2924ac2242edc77c26bddb04bebd379a256457fb63cf" Feb 21 00:36:18 crc kubenswrapper[4906]: I0221 
00:36:18.987472 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4q2t9" Feb 21 00:36:19 crc kubenswrapper[4906]: I0221 00:36:19.014264 4906 scope.go:117] "RemoveContainer" containerID="5c58946a27d36a593f9382b751fd49196729cf91acfb1a6f3f47d7e9ab495e35" Feb 21 00:36:19 crc kubenswrapper[4906]: I0221 00:36:19.032360 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4q2t9"] Feb 21 00:36:19 crc kubenswrapper[4906]: I0221 00:36:19.038373 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4q2t9"] Feb 21 00:36:19 crc kubenswrapper[4906]: I0221 00:36:19.048123 4906 scope.go:117] "RemoveContainer" containerID="d6b08abce9b2ff1f5290dd85173cb074d144748ccc89d98ef0f7645faffd2f58" Feb 21 00:36:19 crc kubenswrapper[4906]: I0221 00:36:19.069377 4906 scope.go:117] "RemoveContainer" containerID="3ebb3de7bd4a1857bdcf2924ac2242edc77c26bddb04bebd379a256457fb63cf" Feb 21 00:36:19 crc kubenswrapper[4906]: E0221 00:36:19.070179 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ebb3de7bd4a1857bdcf2924ac2242edc77c26bddb04bebd379a256457fb63cf\": container with ID starting with 3ebb3de7bd4a1857bdcf2924ac2242edc77c26bddb04bebd379a256457fb63cf not found: ID does not exist" containerID="3ebb3de7bd4a1857bdcf2924ac2242edc77c26bddb04bebd379a256457fb63cf" Feb 21 00:36:19 crc kubenswrapper[4906]: I0221 00:36:19.070271 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ebb3de7bd4a1857bdcf2924ac2242edc77c26bddb04bebd379a256457fb63cf"} err="failed to get container status \"3ebb3de7bd4a1857bdcf2924ac2242edc77c26bddb04bebd379a256457fb63cf\": rpc error: code = NotFound desc = could not find container \"3ebb3de7bd4a1857bdcf2924ac2242edc77c26bddb04bebd379a256457fb63cf\": container with ID starting with 
3ebb3de7bd4a1857bdcf2924ac2242edc77c26bddb04bebd379a256457fb63cf not found: ID does not exist" Feb 21 00:36:19 crc kubenswrapper[4906]: I0221 00:36:19.070328 4906 scope.go:117] "RemoveContainer" containerID="5c58946a27d36a593f9382b751fd49196729cf91acfb1a6f3f47d7e9ab495e35" Feb 21 00:36:19 crc kubenswrapper[4906]: E0221 00:36:19.070943 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c58946a27d36a593f9382b751fd49196729cf91acfb1a6f3f47d7e9ab495e35\": container with ID starting with 5c58946a27d36a593f9382b751fd49196729cf91acfb1a6f3f47d7e9ab495e35 not found: ID does not exist" containerID="5c58946a27d36a593f9382b751fd49196729cf91acfb1a6f3f47d7e9ab495e35" Feb 21 00:36:19 crc kubenswrapper[4906]: I0221 00:36:19.070999 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c58946a27d36a593f9382b751fd49196729cf91acfb1a6f3f47d7e9ab495e35"} err="failed to get container status \"5c58946a27d36a593f9382b751fd49196729cf91acfb1a6f3f47d7e9ab495e35\": rpc error: code = NotFound desc = could not find container \"5c58946a27d36a593f9382b751fd49196729cf91acfb1a6f3f47d7e9ab495e35\": container with ID starting with 5c58946a27d36a593f9382b751fd49196729cf91acfb1a6f3f47d7e9ab495e35 not found: ID does not exist" Feb 21 00:36:19 crc kubenswrapper[4906]: I0221 00:36:19.071041 4906 scope.go:117] "RemoveContainer" containerID="d6b08abce9b2ff1f5290dd85173cb074d144748ccc89d98ef0f7645faffd2f58" Feb 21 00:36:19 crc kubenswrapper[4906]: E0221 00:36:19.071380 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6b08abce9b2ff1f5290dd85173cb074d144748ccc89d98ef0f7645faffd2f58\": container with ID starting with d6b08abce9b2ff1f5290dd85173cb074d144748ccc89d98ef0f7645faffd2f58 not found: ID does not exist" containerID="d6b08abce9b2ff1f5290dd85173cb074d144748ccc89d98ef0f7645faffd2f58" Feb 21 00:36:19 crc 
kubenswrapper[4906]: I0221 00:36:19.071412 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6b08abce9b2ff1f5290dd85173cb074d144748ccc89d98ef0f7645faffd2f58"} err="failed to get container status \"d6b08abce9b2ff1f5290dd85173cb074d144748ccc89d98ef0f7645faffd2f58\": rpc error: code = NotFound desc = could not find container \"d6b08abce9b2ff1f5290dd85173cb074d144748ccc89d98ef0f7645faffd2f58\": container with ID starting with d6b08abce9b2ff1f5290dd85173cb074d144748ccc89d98ef0f7645faffd2f58 not found: ID does not exist" Feb 21 00:36:19 crc kubenswrapper[4906]: I0221 00:36:19.532434 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="248fbd3c-e9c9-4224-b7c0-d6c51e6c754e" path="/var/lib/kubelet/pods/248fbd3c-e9c9-4224-b7c0-d6c51e6c754e/volumes" Feb 21 00:36:20 crc kubenswrapper[4906]: E0221 00:36:20.569277 4906 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest: reading manifest latest in image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index: manifest unknown" image="image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest" Feb 21 00:36:20 crc kubenswrapper[4906]: E0221 00:36:20.569492 4906 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:registry-server,Image:image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:grpc,HostPort:0,ContainerPort:50051,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qxhgc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infrawatch-operators-7gxrh_service-telemetry(2447e8f2-533d-4a34-8379-fa94b6bd6d4f): ErrImagePull: initializing source docker://image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest: reading manifest latest in 
image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index: manifest unknown" logger="UnhandledError" Feb 21 00:36:20 crc kubenswrapper[4906]: E0221 00:36:20.570841 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ErrImagePull: \"initializing source docker://image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest: reading manifest latest in image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index: manifest unknown\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:36:21 crc kubenswrapper[4906]: I0221 00:36:21.517308 4906 scope.go:117] "RemoveContainer" containerID="6fc979229554d0d6557319058e84b2ee43f562aac81387cb471faf3829817c16" Feb 21 00:36:21 crc kubenswrapper[4906]: E0221 00:36:21.517883 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9qdv_openshift-machine-config-operator(17518505-fa81-4399-b6cd-5527dae35ef3)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3" Feb 21 00:36:29 crc kubenswrapper[4906]: E0221 00:36:29.519088 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:36:35 crc kubenswrapper[4906]: I0221 00:36:35.525226 4906 scope.go:117] "RemoveContainer" 
containerID="6fc979229554d0d6557319058e84b2ee43f562aac81387cb471faf3829817c16" Feb 21 00:36:35 crc kubenswrapper[4906]: E0221 00:36:35.525966 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9qdv_openshift-machine-config-operator(17518505-fa81-4399-b6cd-5527dae35ef3)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3" Feb 21 00:36:35 crc kubenswrapper[4906]: E0221 00:36:35.529853 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:36:40 crc kubenswrapper[4906]: E0221 00:36:40.519152 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:36:49 crc kubenswrapper[4906]: E0221 00:36:49.519421 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:36:50 crc kubenswrapper[4906]: I0221 00:36:50.517767 4906 scope.go:117] "RemoveContainer" 
containerID="6fc979229554d0d6557319058e84b2ee43f562aac81387cb471faf3829817c16" Feb 21 00:36:50 crc kubenswrapper[4906]: E0221 00:36:50.518586 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9qdv_openshift-machine-config-operator(17518505-fa81-4399-b6cd-5527dae35ef3)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3" Feb 21 00:36:55 crc kubenswrapper[4906]: E0221 00:36:55.523020 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:37:00 crc kubenswrapper[4906]: E0221 00:37:00.518730 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:37:02 crc kubenswrapper[4906]: I0221 00:37:02.516821 4906 scope.go:117] "RemoveContainer" containerID="6fc979229554d0d6557319058e84b2ee43f562aac81387cb471faf3829817c16" Feb 21 00:37:02 crc kubenswrapper[4906]: E0221 00:37:02.517828 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-b9qdv_openshift-machine-config-operator(17518505-fa81-4399-b6cd-5527dae35ef3)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3" Feb 21 00:37:10 crc kubenswrapper[4906]: E0221 00:37:10.520875 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:37:14 crc kubenswrapper[4906]: E0221 00:37:14.519320 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:37:15 crc kubenswrapper[4906]: I0221 00:37:15.520637 4906 scope.go:117] "RemoveContainer" containerID="6fc979229554d0d6557319058e84b2ee43f562aac81387cb471faf3829817c16" Feb 21 00:37:15 crc kubenswrapper[4906]: E0221 00:37:15.521221 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9qdv_openshift-machine-config-operator(17518505-fa81-4399-b6cd-5527dae35ef3)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3" Feb 21 00:37:25 crc kubenswrapper[4906]: E0221 00:37:25.527157 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image 
\\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-l49m7" podUID="6acb63f4-309c-49bc-b8e6-007db92e699a" Feb 21 00:37:29 crc kubenswrapper[4906]: E0221 00:37:29.524262 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest\\\"\"" pod="service-telemetry/infrawatch-operators-7gxrh" podUID="2447e8f2-533d-4a34-8379-fa94b6bd6d4f" Feb 21 00:37:30 crc kubenswrapper[4906]: I0221 00:37:30.518878 4906 scope.go:117] "RemoveContainer" containerID="6fc979229554d0d6557319058e84b2ee43f562aac81387cb471faf3829817c16" Feb 21 00:37:30 crc kubenswrapper[4906]: E0221 00:37:30.519514 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9qdv_openshift-machine-config-operator(17518505-fa81-4399-b6cd-5527dae35ef3)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9qdv" podUID="17518505-fa81-4399-b6cd-5527dae35ef3"